What's next in computing?
W3.Hitchhiker
Invited columnist
2021-12-27 03:47
This article is about 5,723 words; reading the full article takes about 9 minutes.
Part eight of a16z partner Chris Dixon's personal reflections.


(New computing eras have occurred every 10–15 years)


(Financial and product cycles evolve mostly independently)

Personal computers enabled entrepreneurs to create word processors, spreadsheets, and many other desktop applications. The internet enabled search engines, e-commerce, email and messaging, social networking, SaaS business applications, and many other services. Smartphones enabled mobile information, mobile social networking, and on-demand services such as ride sharing. Today, we are in the middle of the mobile era, and many more mobile innovations are likely still to come.

Each product era can be divided into two phases: 1) the gestation phase, when the new platform first arrives but is expensive, incomplete, and/or difficult to use; and 2) the growth phase, when a product emerges that solves those problems, kicking off a period of exponential growth.

(PC annual sales in thousands, source: http://jeremyreimer.com/m-item.lsp?i=137)

The gestation phase of the internet took place during the 80s and early 90s.

(Web users worldwide, source: http://churchm.ag/numbers-internet-use/)

There were feature phones in the 90s and early smartphones like the Sidekick and Blackberry in the early 2000s, but the growth phase of smartphones really started in 2007-08 with the release of the iPhone and then Android. Since then, smartphone adoption has exploded: about 2B people own a smartphone today, and by 2020 an estimated 80% of the world's population will own one.

(Annual worldwide smartphone sales (millions))

If the 10-15 year pattern repeats itself, the next era of computing should enter its growth phase in the next few years. In this case, we should already be in the gestation stage. There are some important trends in both hardware and software that give us a glimpse of what the next era of computing might be. Here, I touch on these trends and then offer some suggestions for what the future might hold.

Hardware: small, cheap, ubiquitous

(Computers are steadily getting smaller, source: http://www.nature.com/news/the-chips-are-down-for-moore-s-law-1.19338)

We are now entering an era where processors and sensors have become so small and cheap that there will be far more computers than people.

This is happening for two reasons. The first is the steady progress of the semiconductor industry (Moore's Law). The second is what Chris Anderson calls "the peace dividend of the smartphone wars": the massive success of smartphones has driven massive investments in processors and sensors. If you disassemble a modern drone, VR headset, or IoT device, you'll find mostly smartphone components.

In the modern semiconductor era, the focus has shifted from standalone CPUs to bundles of specialized chips known as systems-on-a-chip (SoCs).

(Computer prices are steadily falling, source: https://medium.com/@magicsilicon/computing-transitions-22c07b9c457a#.j4cm9m6qu)


(Raspberry Pi Zero: 1 GHz Linux computer for $5)

This new architecture reduces the price of a basic computing system from about $100 to about $10. The Raspberry Pi Zero is a 1 GHz Linux computer that you can buy for $5. For a similar price you can buy a microcontroller with wifi support that runs a version of Python. Soon these chips will cost less than a dollar, and it will be cost-effective to embed computers into almost everything.


(Google’s quantum computer, source: https://www.technologyreview.com/s/544421/googles-quantum-dream-machine/)

One wildcard technology is quantum computing, which today exists mostly in laboratories but, if commercialized, could bring order-of-magnitude performance improvements to certain classes of algorithms in fields such as biology and artificial intelligence.

Software: The Golden Age of AI

Systems like Hadoop and Spark are used to parallelize big data problems, and Bitcoin/blockchain systems to secure data and assets.

One famous prediction held that by the year 2000 machines would be able to successfully imitate humans; that deadline came and went. However, there are also good reasons to think that AI may now finally be entering a golden age.

"Machine learning is a core, transformative way in which we are rethinking everything we do." - Sundar Pichai, Google CEO

Much of the excitement in artificial intelligence has centered on deep learning, a machine learning technique popularized by the now-famous 2012 Google project that used a massive computer cluster to learn to recognize cats in YouTube videos. Deep learning is a descendant of neural networks, a technique that traces back decades; its recent success is driven by a combination of bigger datasets, cheaper parallel computation, and better algorithms.

(ImageNet challenge error rates; red line = human performance. Source: http://www.slideshare.net/nervanasys/sd-meetup-12215)

We're tempted to dismiss deep learning as just another Silicon Valley buzzword. However, the excitement is backed by impressive theoretical and real-world results. Before deep learning, winners of the ImageNet Challenge (a popular machine vision competition) had error rates of 20-30%. With deep learning, the accuracy of winning algorithms improved steadily, surpassing human performance in 2015.

Many papers, datasets, and software tools related to deep learning are open source. This has a democratizing effect, enabling individuals and small organizations to build powerful applications. WhatsApp needed only about 50 engineers to build a global messaging service for 900 million users, compared with the thousands of engineers earlier generations of messaging systems required. This "WhatsApp effect" is now coming to AI: software tools like Theano and TensorFlow, combined with cloud data centers for training and cheap GPUs for deployment, enable small teams of engineers to build state-of-the-art AI systems.
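As a minimal, hypothetical sketch of what these open tools automate, here is gradient-descent training of a tiny neural network on the classic XOR problem, in plain NumPy. Frameworks like Theano and TensorFlow add automatic differentiation, GPU execution, and scale, but the core training loop is the same idea; all hyperparameters below are illustrative choices, not from the original article.

```python
import numpy as np

# Hypothetical toy example: a two-layer neural network trained by gradient
# descent to learn XOR. Frameworks like Theano and TensorFlow automate the
# backward pass below (plus GPU execution); the core loop is the same idea.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8))  # input -> hidden weights
b1 = np.zeros((1, 8))
W2 = rng.normal(0.0, 1.0, (8, 1))  # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for _ in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # backward pass: gradient of binary cross-entropy w.r.t. each parameter
    dz2 = p - y
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0, keepdims=True)
    dz1 = (dz2 @ W2.T) * (1.0 - h ** 2)  # tanh derivative
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0, keepdims=True)
    # gradient descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

pred = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(pred.ravel().tolist())
```

With enough hidden units and iterations the network learns the XOR mapping, something no linear model can do; that expressiveness, scaled up by many orders of magnitude, is what powers modern image and speech recognition.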

For example, here an individual programmer used deep learning to automatically colorize black-and-white photos:

(Left: black & white; middle: automatic colorization; right: true color. Source: http://tinyclouds.org/colorize/)

And here, a small startup created a real-time object classifier:

This of course brings to mind a famous scene from a sci-fi movie:

One of the first applications of deep learning released by a major tech company was the search function in Google Photos, and it is shockingly smart:

We will soon see a massive upgrade in the intelligence of a variety of products, including: voice assistants, search engines, chatbots, 3D scanners, language translators, cars, drones, medical imaging systems, and more.

"The business plans of the next 10,000 startups are easy to forecast: take X and add artificial intelligence. This is a big deal, and it's here now." - Kevin Kelly

Startups building AI products will need to maintain a laser focus on specific applications to compete with big tech companies that have made AI a top priority. AI systems improve as they collect more data, which makes it possible to create a data network effect: more users generate more data, more data makes the product better, and a better product attracts more users. The mapping app Waze, for example, used data network effects to produce better maps than its far better capitalized competitors. Successful AI startups will follow a similar strategy.
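The feedback loop behind a data network effect (more users → more data → a better product → more users) can be sketched with a toy simulation. Every number and the quality function below are hypothetical, chosen only to illustrate the compounding dynamic, not taken from any real product.

```python
# Hypothetical toy model of a data network effect. A product that collects
# data from its users improves, which attracts more users; an otherwise
# identical product that collects no data stays flat.

def simulate(initial_users, data_per_user, rounds=10):
    users, data = float(initial_users), 0.0
    for _ in range(rounds):
        data += users * data_per_user        # users generate data
        quality = 0.01 * data ** 0.5         # diminishing returns on data
        users += users * min(quality, 1.0)   # better product attracts users
    return users

with_loop = simulate(1000, data_per_user=1.0)     # collects data each round
without_loop = simulate(1000, data_per_user=0.0)  # no data, so no loop
print(round(with_loop), round(without_loop))
```

The capped, diminishing-returns quality function keeps the toy model from exploding; the point is simply that the product with the loop compounds while the one without it stays at its starting size.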

Software + Hardware: The New Computer

There are all sorts of new computing platforms in the pipeline, and over time they'll get better -- and possibly enter a growth phase -- as they incorporate the latest advances in hardware and software. Although their designs and packaging are very different, they share a common theme: they bring us new, enhanced capabilities by embedding an intelligent virtualization layer on top of the world. Here's a brief overview of some of the new platforms:

Cars.Big tech companies like Google, Apple, Uber, and Tesla are devoting a lot of resources to self-driving cars. Semi-autonomous vehicles like the Tesla Model S are already available to the public and will improve rapidly. Fully autonomous driving will take longer, but probably not more than 5 years. There are already fully autonomous cars that are almost as good as human drivers. However, for cultural and regulatory reasons, fully autonomous cars may need to do a lot better than human drivers before they are widely allowed.

Startups are also taking autonomous driving very seriously, and you'll see some interesting products from them too. Deep learning software tools have gotten so good that an individual programmer was able to build a semi-autonomous car:

Drones. Today's consumer drones contain modern hardware (mainly smartphone components plus mechanical parts), but relatively simple software. In the near future, we'll see drones that incorporate advanced computer vision and other artificial intelligence to make them safer, easier to fly, and more useful. Casual videography will continue to be popular, but there will also be important business use cases: tens of millions of dangerous jobs that involve climbing buildings, towers, and other structures could be performed more safely and efficiently with drones.

Internet of Things (IoT). Successful early products like Nest and Dropcam have focused on narrow home applications such as monitoring and security. Another standout is Amazon's Echo:

(Three main use cases of IoT)

Most people dismissed the Echo as a gimmick before trying it, then were surprised by how useful it is. It's a great demo of how effective always-on voice can be as a user interface. It will be a while before we have bots with general intelligence capable of full conversations. But, as the Echo shows, voice can already succeed in limited settings. As recent breakthroughs in deep learning make their way into production devices, language understanding will improve rapidly.

IoT will also be adopted in business contexts. For example, devices with sensors and network connections are very useful for monitoring industrial equipment.

Wearables.Today's wearable computers are limited in several ways, including battery, communication, and processing. Those wearables that have been successful have focused on narrow applications, such as fitness monitoring. With the continuous improvement of hardware components, wearable devices will support rich applications like smartphones, unlocking a wide range of new applications. Like the Internet of Things, voice will likely become the primary user interface.

Virtual Reality. 2016 was an exciting year for VR: the Oculus Rift and HTC/Valve Vive (and possibly the Sony PlayStation VR) arrived, meaning comfortable and immersive VR systems are finally available to the public. VR systems need to be really good to avoid the "uncanny valley" trap. Proper VR requires special screens (high resolution, high refresh rate, low persistence), powerful graphics cards, and the ability to track the user's precise position (previously released VR systems could only track the rotation of the user's head). This year, the public will experience so-called "presence" for the first time: the sense of actually being transported into a virtual world.

VR headsets will continue to improve and become more affordable. Major areas of research will include: 1) new tools for creating rendered and/or captured VR content, 2) machine vision for tracking and scanning, and 3) distributed systems for hosting large virtual environments.

Augmented Reality.

(Real and virtual combined, from *Kingsman*)

What's next?

It is possible that the 10-15 year computing cycle model is over and mobile is the last era. It's also possible that the next era won't arrive for some time, or that only a subset of the new computing categories discussed above will end up being important.

I tend to think that we are on the cusp of not one but many new eras. The "peace dividend of the smartphone wars" has created a Cambrian explosion of new devices, and developments in software, especially artificial intelligence, will make these devices smart and useful. Many of the futuristic technologies discussed above exist today and will be widely available in the near future.

Observers have noted that many of these new devices are in an "awkward adolescence." That is because they are in their gestation phase. Like personal computers in the 70s, the internet in the 80s, and smartphones in the 2000s, we are seeing fragments of a future that hasn't fully arrived yet. But the future is coming: markets rise and fall, and excitement ebbs and flows, but computing technology marches steadily forward.
