
Time Travel Through 2010s Technology: Part 1


Geek on record (@geekonrecord)

Commentary on technology

We are now close to the end of an important decade for technology, a decade that started without many of the innovations that today we consider part of the norm. Artificial intelligence at home, self-driving cars, wearable devices, supercomputers in our pockets… the 2010s not only changed the technology we use, but also how we communicate and think. Privacy has never been so critical as a selling point, and information bubbles have never been so polarizing. Today, we are at a turning point in the tech industry; it’s not clear what’s going to be the next revolutionary tech segment, or how companies are going to keep convincing customers to upgrade their various devices.
So what happened in the last ten years? How did we get here? The following areas have experienced substantial changes since 2010, making our lives considerably better in some cases, while taking a few surprising turns in some others. This is the first of a series of two posts that take a look back at a decade of tech evolution.

Operating systems

The mobile OS wars were in full swing in 2010: Microsoft was preparing to go all-in with Windows Phone 7, while Apple’s iOS and Google’s Android were fighting for the mobile market with old contenders like Nokia’s Symbian.
Ten years later, iOS and Android are the clear winners in the mobile market, and the operating system itself is dwindling in importance. Companies are now focused on cementing their market share by selling, or often giving away for free, services that hook customers into their respective ecosystems. Even Microsoft has abandoned its once Windows-centric mentality for a services-first approach, where its officially recommended mobile solution is an Android smartphone filled with Microsoft software.
In between those two points in time, we saw the rise and death of Windows Phone, which arguably started off on the wrong foot by attempting to move the focus away from apps and onto aggregated hubs integrated within the OS. It soon became clear that it was impossible to keep up with all the new features that developers were bringing to their apps, which created an "app gap". This eventually became Windows Phone's ultimate Achilles' heel.
We also saw Apple reduce the distance between macOS and iOS thanks to iPadOS, bringing the iPad closer than ever to becoming a decent alternative for those who make casual use of their personal computer. Given that the iPad didn't exist until early 2010, it's impressive to see how software changes have made the device evolve from a "big iPhone" to a "laptop replacement".
In 2010, the desktop situation wasn’t that different from what consumers had seen during previous years; Windows 7 was a big hit and Microsoft was confident that their new touch-first user interface would make Windows 8 a successful reinvention of the classic operating system.
A decade later, Windows 10 gets updates over the air once or twice a year, providing a Windows-as-a-Service experience that makes it resemble a mobile OS. Today, Windows 10 is more of a vessel for Microsoft services like Skype, Office, Outlook and Bing. Pushing the envelope on operating systems is no longer a priority for Microsoft; anyone who wrote that sentence in 2010 would have been considered a fool.

Personal computing

Smartphones were already getting useful in 2010. Apple's App Store, which had launched two years earlier, offered 225,000 apps by then. This had such a profound impact on the personal computing space that "app" was awarded the honor of being 2010's "Word of the Year" by the American Dialect Society: smartphones were becoming mainstream for the average consumer. Their sizes were considerably smaller by 2019 standards; a regular display measured between 3.2 and 3.7 inches.
Technology wasn't what it is today; it was still common to carry a compact camera when going on vacation due to the limitations of our phones. In 2010, the iPhone 4 offered a 5-megapixel camera for $299, whereas a regular Sony Cyber-shot had a 12.2-megapixel sensor for the same price. A smartphone might have been fine for casual photos, but there was still a sizeable market opportunity for compact cameras.
Today, the iPhone 11 Pro has a 12-megapixel camera that seamlessly integrates with advanced image processing software, making pictures look practically professional. Seeing a compact camera in the wild is a rare sight these days. Smartphones have become true supercomputers since 2010. There is almost no task that cannot be done on a smartphone: gaming, video editing, professional photography, writing… you name it. 
And when the screen size is too small for some tasks, tablets are here to save the day. Introduced in January 2010, the iPad redefined the tablet segment in a way that few expected. It all started with a disappointing “big iPhone” without the ability to multitask, but today’s iPads are devices that can replace traditional computers. In fact, I recently replaced my laptop with an iPad Pro and haven’t looked back.
Apple was not the only company in the tablet market though. Microsoft tried to differentiate itself in 2012 by making a tablet with support for mouse and keyboard, the Surface RT (which launched with Windows 8). It was a failed product for several reasons, but it succeeded in creating the foundation of what a good 2-in-1 PC is today.
Subsequent Surface versions created a billion-dollar business with the slogan "a tablet that can replace your laptop", and years later, Apple followed suit by adding support for mouse, keyboard and even a stylus. Despite Steve Jobs' famous disdain for styluses, I'm convinced he would have loved the Apple Pencil.
Judging from the most exciting announcements of 2019, foldable devices look like the last major tech trend of the decade, led by the once-delayed Samsung Galaxy Fold, the surprising Surface Duo (Microsoft's return to the smartphone space, with Android as the device's OS) and the resurrected Motorola Razr.
The changes between iterations of the most popular gadgets have become smaller year after year. This year's iPhone is very similar to last year's model, and the differences are only obvious when compared across a long period of time (e.g. 2010's iPhone 4 vs 2019's iPhone 11). Many critics have been talking about a slower innovation pace for years, so it will be interesting to see what new products capture the imagination of consumers during the next decade.

Wearables

“Wearables” were not mainstream in 2010, mostly because the tech industry was still trying to find popular use cases beyond portable music. Through the decade, two categories became the focus: wristwatches and glasses.
As happened with smartphones and tablets, Apple didn't create the first smartwatch, but it learned from others like Fitbit and created what is today one of the most popular smartwatches on the market. The first Apple Watch, introduced in 2015, struggled to decide whether it wanted to be a productivity device or a fitness device, and it was heavily dependent on the iPhone. Over time, the Cupertino giant was able to position the gadget as a health-focused watch with features like fall detection, heart-rate monitoring, EKG and more, while simultaneously making it a standalone device with GPS and cellular radio capabilities.
Many other manufacturers like Sony and Samsung joined the battle for the best wearable, flooding the market with options for every customer profile. Pebble also helped kickstart this space in 2012 and was acquired by Fitbit in 2016; Fitbit itself was acquired by Google in late 2019. Microsoft's own attempt at a fitness band was short-lived: announced by surprise in 2014, it was sadly discontinued in 2016 after just two generations. Nonetheless, as we approach the end of the 2010s, the smartwatch space is vibrant and full of competition, with more customers jumping in every day.
Apart from smartwatches, the tech industry has been trying to make us wear smartglasses for a while. The first newsworthy attempt at reaching the consumer market was made in 2012 by Oculus, with a virtual reality (VR) headset known as the Rift, and a year later by Google, with an augmented reality (AR) head-mounted display known as Google Glass. Google's project never graduated from a developer experiment due to privacy concerns, since it was possible to record video or take pictures without being noticed. Tech critics aligned against a future where silent surveillance is possible.
Microsoft tried a different approach to smartglasses with HoloLens, an AR head-mounted device introduced in 2016 that provided a glimpse into a future where our reality is not limited by the walls around us. That future is still far off, though, as HoloLens remains out of reach for the mainstream consumer due to its high price and how uncomfortable it is to wear for long periods of time.
More recent VR headsets released in 2018 and 2019 have improved in performance and independence; they no longer need to be wired to a powerful computer. Oculus, which Facebook acquired in 2014, and other manufacturers have confirmed that there is consumer appetite for virtual and augmented experiences in our daily lives, and not just for gaming. Smartglasses still look closer to helmets than to actual glasses, so now we just need to wait for the technology to catch up.
This series continues tomorrow with a look back at how technology has changed during the 2010s on transportation, social media, AI and automation. Stay tuned!
Did you like this article? Subscribe to get new posts by email.
Photo by Josh Newton on Unsplash
