On FuchsiaOS, AI assistants and software optimisation

by Edoardo Paolo Scalafiotti, February 12th, 2018

Software, not hardware, will drive performance in the AI era, so Rust, Go, Swift and Dart could be the best options to future-proof a dev’s career

Below is a list of emerging trends in technology that, as developers, we should probably keep in mind both in our professional lives and in designing our career paths.

  • Moore’s trend (and please do not call it a law) is over. CPUs will start getting physically bigger again: some of the Apple A11’s secret sauce over Qualcomm’s Snapdragon 835 can be traced to its larger surface area
  • Optimization is where enhanced performance will come from, and it will become a far bigger deal than before, since hardware won’t be there to cover software’s shortcomings anymore (oh, and by the way, we might actually end up with systems that effectively perform worse because of the Meltdown and Spectre mitigations)
  • Hardware fatigue might ensue. Since incremental enhancements in smartphones and laptops basically come down to revamped designs and camera features, manufacturers will need to up their game in justifying the purchase of a new version of a gadget. Specifically:
  • AI-based software features (which might just conveniently require a specific chip) are where gadget manufacturers will look for growth. The Pixel 2 embraces this trend: a higher DxOMark score for the camera won’t cut it anymore
  • AI-based assistants and related features will become central: at the moment Google seems the best-placed player in mobile, and Alexa in the smart home. Apple’s Siri is years behind the incumbents, which raises the question of whether Apple went to market too early with its AI assistant, a very non-Jobs move for a company famous for arriving late to the game with the intention of getting it perfectly right. As for Samsung, it seems the most requested Bixby feature is the ability to remap its trigger button to the Google Assistant
  • If the above trend continues, expect the emergence of an Open Source Assistant. “But… what about the seed data to train the Machine Learning algorithm?” Hm, what about a distributed system like this?
  • AI-based OSes will ensue. Enter Google’s Fuchsia, an operating system with all the ambition of becoming the new Android for mobile, desktop and wearables (“this is the year of convergence” has become the new “this is the year of the Desktop Linux”).

Fuchsia looks very interesting for several reasons:

  • It has a new UI paradigm based on a Google Now-like feed organized around Stories and Modules: it remains to be seen whether this will end up like the famously tragic Windows 8 move, when Microsoft tried to sideline the Desktop only to bring it back shortly after
  • Fuchsia is a clean slate: it’s neither Linux nor Unix. It uses a new kernel called Zircon and as such won’t have to bear any legacy “features”
  • Fuchsia is being built with, and will support, performance-first programming languages (promoting good coding practices). Specifically, the old behemoths are still there (C and C++), but so are the new cool kids on the block (Rust, Go and Swift). Plus, its UI is built on Flutter, which uses the statically typed Dart programming language

Regardless of the success of Fuchsia (which will likely be released in 2019 or 2020), I personally believe it would be very wise for any developer wondering what to add to their dev toolbelt to simply observe whatever programming languages Fuchsia supports and learn them:

  • The above is not because of blind faith in Google’s OS, but because it follows a clear pattern: interpreted PLs won’t stand the test of the AI era, where high computing power is necessary to train a model or perform core computation and no headroom will be left for the interpreter itself. Of course we won’t see the end of the likes of Java, Python and JavaScript, but to future-proof a career I’d rather go with the statically typed, compiled blokes
  • If you are into front-end, you should learn TypeScript (used in both React and Angular; see the sketch after this list) and keep a very open mind about Dart and Flutter for unifying a mobile codebase (Ionic, Cordova and any webview-based framework won’t withstand the test of performance)
  • Also, with the upcoming release of WebAssembly (a new compilation target for the browser; see the loading sketch after this list), JavaScript’s preeminence will likely scale down over the next few years. This won’t happen overnight, but if you are a certain smartphone manufacturer trying to optimize battery usage, won’t you push your developers into adopting more performant technologies? (Does anyone remember what happened to Flash?)
  • If you are into back-end microservices, Go should definitely be part of your toolkit. I personally find some Rust concepts utterly over-complicated for the server realm, but hey, security first, right?
  • If you are into IoT or embedded systems, besides the usual C and C++, Rust should be on your horizon
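
To make the TypeScript point above concrete, here is a minimal sketch of what static typing buys you in a React component. The `Greeting` component and its props are made up for illustration; the pattern is the same in any React + TypeScript codebase.

```tsx
import React from "react";

// Props are declared up front, so a missing or mis-typed `name`
// is caught by the compiler at build time rather than in the browser.
interface GreetingProps {
  name: string;
  excited?: boolean; // optional, defaults to false below
}

export function Greeting({ name, excited = false }: GreetingProps) {
  return <h1>{excited ? `Hello, ${name}!` : `Hello, ${name}`}</h1>;
}

// Usage: <Greeting name="Ada" excited />
// <Greeting name={42} /> would be a compile-time error, not a runtime bug.
```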
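
And to illustrate the WebAssembly point: the browser already exposes a small JavaScript API for loading compiled modules. The sketch below assumes a hypothetical `add.wasm` module (compiled from Rust, Go, C or anything else) that exports an `add` function; the API calls themselves are the standard ones.

```ts
// Fetch, compile and instantiate a WebAssembly module in the browser.
// `add.wasm` is a hypothetical module that exports an `add(a, b)` function.
async function loadAdder(): Promise<(a: number, b: number) => number> {
  const { instance } = await WebAssembly.instantiateStreaming(
    fetch("add.wasm"),
    {} // empty import object: this module needs nothing from JavaScript
  );
  // Wasm exports are untyped from TypeScript's point of view,
  // so we assert the shape we expect.
  return instance.exports.add as (a: number, b: number) => number;
}

loadAdder().then((add) => {
  console.log(add(2, 3)); // 5, computed by the compiled module
});
```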