The Future of Computing (Ubiquitous Computing)

Hi, thanks for tuning into Singularity Prosperity. This video is the twelfth and final in a multi-part series discussing computing. In this video, we'll be discussing the future of computing, more specifically – the evolution of the field of computing, extrapolating forward based on the topics we've discussed so far in this series!

There have been three primary eras in the evolution of the field of computing since its inception. The first, extending from as early as 3000 BC to the 1940s, is the tabulating era. This era of computing was focused on simple calculations and data collection, encompassing many technologies, to list a few: the abacus, the Pascaline mechanical calculator, the difference engine and the census tabulator. From the premise of storing and manipulating data came the second era of computing, the programming era, beginning in the 1950s. Built on the inventions and theories of many great minds, this era allowed computers to carry out complex instructions by giving them a language that could be represented in physical hardware: machine language, in other words, binary. As time progressed, and as is the case now, layers of abstraction allow programming in higher-level, more human-friendly languages, which are then compiled down to machine language. We are still very much in this era of computing; however, since 2010 the field of computing has been going through another disruptive period, a transition to the third era of computing, the cognitive era – in which machines act based on learning from large amounts of data instead of executing pre-programmed instructions.

As the field of computing has been evolving towards the cognitive era, another paradigm shift in computing has been accelerating the transition: infinite computing. Before continuing, note that infinite computing encompasses many of the topics we discussed in previous videos in this series, so be sure to check them out if you want more detailed information on certain topics.
Back on topic: infinite computing refers to the use of cloud computing, computing as a utility, powered by the principles of heterogeneous architecture. The cloud is currently powered by a combination of CPUs, GPUs, FPGAs, ASICs and memory devices, each of which plays a pivotal role. These devices have been, and will continue, increasing in performance and efficiency thanks to shrinking transistors, new materials, 3D-integrated circuits, new memory devices and standards, as well as other optimizations in software made to be tightly coupled with hardware. Beyond these classical compute technologies, new fields of computing able to truly operate in parallel will also be utilized, and are slowly starting to be used in the cloud as well: 1) optical computing, which will provide significant speed-ups in data transfer as well as in compute devices; 2) quantum computing, which will be able to solve new types of problems and reduce the probability space in optimization problems; 3) clockless, parallel neuromorphic architectures, which are based on the human brain and will provide significant performance and efficiency gains in artificial intelligence applications. Under heterogeneous architecture, all these devices, architectures and paradigms will work together in the cloud, based on the needs of the end user, to provide maximum performance and efficiency for a desired task – hence the name of this paradigm, infinite computing. Infinite computing will be an additional tool for computer enthusiasts, scientists, etc., and an alternative computing method for the majority of the world's population. For some examples, let's look at some resources that may be allocated to certain types of individuals or groups, ranging from light usage to a business or startup. As you can see, the future of computing will be truly personal and optimized for your specific needs.
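To make the device-matching idea above concrete, here is a minimal, illustrative Python sketch of how a heterogeneous scheduler might route workloads to device classes. The task categories, device names and the `dispatch` helper are all hypothetical, for intuition only – not a real cloud API:

```python
# Hypothetical sketch of heterogeneous dispatch: match each workload to the
# device class best suited for it, falling back to the general-purpose CPU.
ACCELERATOR_FOR = {
    "general":      "CPU",   # branching, serial logic
    "graphics":     "GPU",   # massively data-parallel work
    "streaming":    "FPGA",  # reconfigurable, low-latency pipelines
    "inference":    "ASIC",  # fixed-function AI accelerators
    "optimization": "QPU",   # quantum processors for probability-space reduction
}

def dispatch(tasks):
    """Assign each (name, category) task to an accelerator class."""
    return {name: ACCELERATOR_FOR.get(category, "CPU")
            for name, category in tasks}

jobs = [("render_frame", "graphics"),
        ("route_planning", "optimization"),
        ("spell_check", "general")]
print(dispatch(jobs))
```

The point of the sketch is simply that, under this paradigm, the allocation decision lives in the cloud rather than with the user, so each task lands on the hardware best suited to it.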
It is worth mentioning that this paradigm of computing is centralized, gaining accessibility at the cost of potential privacy and security concerns. A decentralized cloud based on blockchain principles could change this; however, that is a topic for a future video! Beyond the impact on groups or individual users, the principle of infinite computing also applies to our devices, called edge devices. For example, referring back to the last era of computing, cognitive computing, this means pairing AI ASICs in devices with the cloud. How I personally envision this occurring: quantum computing techniques reduce the probability space of the numerous potential machine learning models a problem may require, and then neuromorphic architectures, paired with classical von Neumann architectures, train and infer based on the highest-probability match and return the model to our devices. There is also a positive feedback loop that will affect the entire field of computing, where increased computing leads to better AI, which in turn leads to increased computing due to new architectures, paradigms, devices, etc., produced with the help of machine learning models, and the loop goes on and on. It is important to note that there are countless directions the industry can end up taking, and the views stated here are my extrapolation based on the information we currently have. To add to this, there are some forms of computing we have yet to even discuss, such as spintronics and bio or DNA computing, a topic best left for this channel's future biotech series. One thing is certain, however: in the coming years and decades, computing will truly encompass everything; this is referred to as ubiquitous computing. Ubiquitous computing is a concept of computing fueled by the rise of abundant, affordable and smart computing devices, where computing is done using any device, in any location and in any format.
This can range from devices such as laptops, desktops and mobile phones to smartwatches, refrigerators, glasses and warehouse automation – the list is truly endless! Along with global connectivity and big data, ubiquitous computing is the next pillar of the technological revolution that is currently occurring, and it will be covered much more in-depth in this channel's video on ambient intelligence! Moving forward, in the next set of videos on this channel, we'll cover the final pillar of the Fourth Industrial Revolution, artificial intelligence, and then, afterwards, how these pillars working in conjunction will impact and shape the future! At this point the video has come to a conclusion; I'd like to thank you for taking the time to watch it! If you enjoyed it, consider supporting me on Patreon to keep this channel growing, and if you have any topic suggestions, please leave them in the comments below! Consider subscribing for more content, follow my Medium publication for accompanying blogs and like my Facebook page for more bite-sized chunks of content. This has been Ankur, you've been watching Singularity Prosperity and I'll see you again soon!

28 thoughts on “The Future of Computing (Ubiquitous Computing)”

  1. Join our Discord server for much better community discussions!

  2. I liked your videos so much that I binged them.
    Sorry, I'm low on finances, so I can't support you on Patreon.

  3. Very good information about the topic and a very nice way to explain it, keep rocking, go on..
    I want more information from your channel, thank you sir..

  4. Awesome video! Really cool when you find new amazing channels! But may I ask what your thoughts on Ray Kurzweil are? And his predictions?

  5. Your videos always leave me with endless thoughts and no words. Honestly, how do you not have a million subscribers?

  6. 4:23 "There is also a positive feedback loop, which will affect the entire field of computing, where increased computing leads to better AI, leading to increased computing due to new architectures, paradigms, etc." As far as I know there is no AI in those fields, but I hope it will be used in the future. Although I heard that Google had used AI to optimize their servers' ventilation. Now that I think about it, we live in an exponential growth era, where innovation in one area has positive effects on research in every other area. For example, AI may be used for better control of qubits in a quantum computer, which is then used in material science to produce a better computer. However, I do not think that AI will be used directly in computer development; for this purpose we need AI with more reasoning from/on data. Currently AI either reasons based on human-defined rules or makes "intuition"-based decisions with no reasoning. I think only the first type is used in hardware design; here is some evidence -> As I thought, mostly low-level stuff and rule-based.

  7. Cryptocurrency mining is not about mining a currency; it's about getting people used to the idea that they can use their personal hardware and electricity as a service for global cloud networks.
    There are already viruses and such that use your CPU and GPU at 100% load to mine using your electricity.
    BitTorrent technology, along with blockchain, could be put together to build a "people's cloud", or "p2p cloud", where we use each other's spare disk space and spare CPU and GPU power to build p2p cloud networks, rather than handing privacy and censorship over to techno-giants.
    The credits you acquire for sharing CPU/GPU/storage could then be exchanged for perks within the p2p cloud system, or maybe exchanged for a currency.
    BitTorrent was a really nice piece of software, and it's time for an even more sophisticated p2p service that can utilize our overpowered home hardware.

  8. This channel is amazing, truly interesting stuff in your videos. Very informative and inspiring. Looking forward to more!
