We’re in a major transition, but we just don’t notice: Part 1

Imagining the future

You return home after a long day at work. As you come in the door, you feel a tapping on your wrist. You lift it and see a new notification on your watch reminding you to start dinner at 6pm. You dismiss the notification and put on a music playlist, which blares from your home stereo. Remembering that your wife is still taking a nap, you quickly switch the music output from the stereo to the wireless headphones in your pocket. You take off your watch, take your phone out of your pocket, and place them both on the living room coffee table. They start charging immediately.

You sit down on the couch, reach over to the side table on your left, and grab your tablet. You start playing the Legend of Zelda Online with your friends.

About half an hour goes by when your wife emerges from the bedroom. She wants to edit some video footage she took that afternoon for her vlog, using the TV. With a swipe on her camera’s touchscreen, the file transfers instantly to her laptop. She docks her laptop beside the TV and her video project appears. Using her phone’s motion controls and the touchscreen, she edits and adds new titles and transitions.

Still wanting to play your video game, you put down your tablet and migrate to your office. You turn on your desktop (yes, we still have desktops in this future) and pick up Zelda where you left off - this time with a real game controller. You tap twice on your headphones, which enables your digital assistant. You say “set lights to game mode” and the office lights dim to a faint glow.

What is going on?

2015 and 2016 were weird years. We got a really big iPad. Phone manufacturers (namely Apple) started removing headphone jacks. The smartwatch industry is on life support. Automation and artificial intelligence are around every corner. We need dongles to connect things again. USB-C is all the rage…

To spectators of the tech industry, it would seem as though companies are floundering. Are they scrambling to find the next big thing? Are they experimenting? Unlikely. It’s more likely we’re in the middle of perhaps the greatest transition the mobile technology industry has ever seen, and it’s all to move us toward greater standardization… mostly.

Wireless future

Apple justified the removal of the headphone jack - the last legacy port - from the iPhone 7 in September. Despite months of rumours, there was still an understandable uproar. I’m certain the iPad is next on the chopping block. Their justification? A wireless future is ahead.

Removing the headphone jack from the iPhone 7 angered many customers and was justified by the fact that we're moving toward a wireless future.

While tech pundits see this strategy as an excuse to peddle more expensive wireless headphones, that doesn’t change the fact that a wireless future is probably coming. Many Android phones have already shipped without headphone jacks, and more will follow.

NPD reported that 2016 was the first year wireless Bluetooth headphones outsold traditional wired ones, and it’s easy to see why. Unlike many technologies, wireless headphones are the cord cutter’s dream. Being able to walk around your home without being tethered to a device really does feel like the future. As standards like Bluetooth get better (or as companies like Apple find workarounds using specialized hardware), we will soon forget what it was like to have a cable.

However, the coming wireless future extends beyond headphones. Wireless charging will certainly become mainstream once a standard is agreed upon and the iPhone supports it. It’s just not there yet.

Inductive charging, while theoretically more convenient than a wire, requires precise alignment. If your smartwatch or phone isn’t positioned just right on the charging pad, you might wake up in the morning to a device with an empty battery. Clearly, current wireless charging isn’t perfect, but it’s not hard to imagine that once the range and efficiency issues are solved, it will become the norm.

File transfer is already going wireless. More and more cameras and devices support WiFi transfer, but there are some hurdles. Big files transfer slowly without a wire, and as video resolutions move from 4K to 8K, we’re going to be relying primarily on wires for a while.
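To see why big video files still favour a cable, a rough back-of-the-envelope calculation helps. The file size and link speeds below are illustrative assumptions, not measurements - real-world throughput varies wildly with hardware and conditions:

```python
# Back-of-the-envelope transfer times (illustrative numbers, not benchmarks).

def transfer_minutes(size_gb: float, link_mbps: float) -> float:
    """Minutes to move size_gb gigabytes over a link_mbps link.

    Uses decimal units: 1 GB ~= 8000 megabits. Ignores protocol
    overhead, so real transfers will be somewhat slower.
    """
    size_megabits = size_gb * 8 * 1000
    return size_megabits / link_mbps / 60

footage_gb = 50  # a hypothetical afternoon of 4K footage

links = [
    ("802.11ac WiFi (~400 Mbps real-world)", 400),
    ("USB 3.0 cable (~5000 Mbps nominal)", 5000),
]

for name, mbps in links:
    print(f"{name}: {transfer_minutes(footage_gb, mbps):.1f} min")
```

Under these assumptions, the same footage takes roughly 17 minutes over WiFi but under a minute and a half over a USB 3.0 cable - an order-of-magnitude gap that only widens as resolutions climb toward 8K.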

But that doesn’t mean we won’t get there. We already transfer audio over Bluetooth and files through AirDrop and NFC. The price of cloud storage is dropping. Our devices can already sync to the cloud automatically, and faster WiFi and cellular speeds will make it seem instantaneous. One major hurdle is rural broadband infrastructure. While citizens in major cities have access to the fastest Internet, those in rural locations have yet to see local investment in fibre networks. There’s a huge digital divide between urban and rural populations, and it affects the latter’s ability to participate in society.

One port to rule them all

USB-C will (hopefully) be your next connector.

I love HDMI, DisplayPort, Thunderbolt, and USB. But there are too many of them. The more ports devices are required to have, the more expensive they are to manufacture.

Ten years ago, we were in the same boat. Most Windows laptops in the mid-2000s came standard with VGA or DVI ports, and we were dragged kicking and screaming into digital connections like HDMI and DisplayPort.

We’re feeling the transition pain once again. We had just gotten used to having no optical drives and rid ourselves of most of our dongles, and then Apple goes and releases a MacBook Pro with only four USB-C ports. Dongles are back, at least for the next couple of years.

However, a port that does data transfer, video out, and charging all at the same time would have been unthinkable ten years ago. The transition to USB-C is already happening - just not as fast as we’d like. That being said, many Windows machines, Chromebooks, phones, and some tablets are shipping with USB-C as their primary - or only - port. Dongles are a necessary inconvenience until we reach a greater level of standardization.

A port that does data transfer, video out, and charging all at the same time would have been unthinkable ten years ago

The obvious holdout here is Apple’s iPhone and iPad, both of which use the proprietary Lightning port. Because Lightning was introduced before USB-C and has since become a standard for accessory makers, it’s unlikely Apple will make the switch to USB-C any time soon.

Moving as many devices as possible to USB-C will result in greater interoperability and convenience. The problem right now is that not all USB-C ports have the same capabilities. It’s also likely that the USB-C connector will see iterations, just like the move from USB 2.0 to 3.0. Even so, one connector for everything is a huge improvement.

The move to ARM

For decades, the x86 standard has completely dominated the personal computer market. It’s a robust standard known for its ability to provide cutting-edge performance and some decent power efficiency. But it doesn’t work so well in phones and tablets, which is why the ARM standard has flourished.

Furthermore, advancements in x86 have slowed considerably over the past seven years. Intel is essentially the only game in town, and the improvements are incremental and more focused on power saving than raw performance.

Some Chromebooks already run on ARM, and with the rumour that Apple will start shipping hybrid ARM/x86 laptops, it’s not hard to imagine ARM becoming the next standard. Why? Computer sales data shows that mobile devices have eclipsed desktops, and chips need to be efficient to ensure long battery life.

Gaming is the obvious holdout, and there will probably continue to be a profitable market for gaming PCs. But the console front is a different story. Even Nintendo has jumped on the ARM bandwagon with its new Switch console, positioning itself very differently from Sony and Microsoft.

x86 is unlikely to go away altogether; more likely, it will remain a niche standard while ARM takes over as the driving chip standard. Windows 10 on ARM has already been announced. Dedicated ARM support from Microsoft or Apple could quickly change the landscape - making x86 unnecessary. And what’s wrong with that?

Developers could make apps that run seamlessly on all devices. Instead of focusing on resolving interoperability issues, developers could focus on improving the user experience by customizing their app interfaces for different devices and use cases.


Transition is painful, and this is the mother of all transitions. Many of the technologies of the future are already here; the challenge is making them all work together. The mobile computing revolution that the iPhone started is only just beginning. The last ten years have been a testing ground for the convergence that’s happening today.

In the future, there will be no wires, all devices will use the same architecture, and we’ll all connect using USB-C. It will be glorious…

Until USB-D is announced…