Blending of AI and AR

Key Features
The book believes in teaching by example. All the tools needed to facilitate a quick understanding of complex concepts are provided in this book:
Definition of key terms
Industry studies, research statistics, etc., that clarify concepts
Spotlight sections
A Word of Caution sections
Chapter summaries
Questions for reflection
Description
Artificial Intelligence Meets Augmented Reality: Redefining Regular Reality is a unique book: it presents the new technology paradigm of artificial intelligence (AI) and augmented reality (AR) and its full transition, from the major advantages that enhance entire industries to changing how the world operates at various levels. New realities will emerge in the context of our existing world through the combination of AI and AR. The book presents both the bright and bleak sides of the AI-AR duo in order to give a holistic view and help us decide how we are going to leverage such technologies, and whether their disruptive or transformative nature will mar or make the future of our world. A workforce of enlightened engineers is the key to designing and developing AI-AR solutions responsibly in order to achieve the greater good. Through the book, Chitra Lele explains a multidisciplinary, integrated approach to minimizing barriers and blending AI and AR without destroying our natural settings. The book will help to chart a path where there is no trail yet, and get you started on developing AI-AR solutions and experiences that better the world in an ethical and responsible manner.

What Will You Learn
Dynamics of Artificial Intelligence and Augmented Reality
AI and AR Ecosystem
Business at the Crossroads of AI and AR
What Does the AI-AR Marriage Hold for the Future of the World
Who This Book Is For
Students, academicians, educationists, professionals and policy researchers.

Table of Contents
PART 1: Dynamics of Artificial Intelligence and Augmented Reality
Introduction to Artificial Intelligence and Augmented Reality
AI and AR Ecosystem
PART 2: Business at the Crossroads of AI and AR
AI Meets AR in the Business Landscape
More Dynamics of the AI-AR Convergence
PART 3: What Does the AI-AR Marriage Hold for the Future of the World
Collaboration of Intelligence and Augmentation in the Real World
Challenges and Solutions
Where Do We Go from Here

About the Author
CHITRA LELE is a young software consultant, academic author and research scholar. She is a double postgraduate, holding a Master in Computer Management and a Master of Science in Software Engineering. Her publications include scholarly articles, research papers and academic books. The India Book of Records has conferred on her the title "A Versatile Writer" for penning the maximum number of books, across various genres, in a short span of eighteen months. LinkedIn Profile: linkedin.com/in/chitraleleauthorandconsultant
Dynamics of Artificial Intelligence and Augmented Reality
Chapter 1
Introduction to Artificial Intelligence and Augmented Reality
The world currently seems full of two-letter acronyms, especially AI (artificial intelligence) and AR (augmented reality). The Terminator film franchise portrays a future where AI is an integral part of day-to-day life; and yes, after all these years we are on that path right now. AI, or machine intelligence, is the simulation of human intelligence by machines such as computer systems. AR is a technology that overlays elements of the digital world onto the real world, thereby enhancing our sensory perception of it. A combination, or convergence, of AI and AR can produce mind-boggling ideas, possibilities and results.
A few years ago, our virtual lives revolved around desktop computers; then the focus shifted to devices that landed right in our palms. A few years from now, the hub of our digital lives and activities will no longer be limited to our iPhones and other smartphones, but will also involve new devices and interfaces, driven by the AI-AR combination, that will blur the line between what is real and what is not.
The combination of AI and AR is going to power the next generation of tools, applications, services, experiences, and so on. Immersive computing built on this convergence is going to change the way we work, live, entertain, educate, communicate, learn and share. Several industries stand to gain from this merger. For example, the education industry can use the combination by having AI deliver learning content, programs and tools through learning assistants, while AR provides an immersive and interactive environment that enhances the learning process. There is an endless universe of possibilities.
1.1 Artificial Intelligence
The term Artificial Intelligence (AI) often evokes mind-blowing images from fantasy books and science fiction movies. However, AI isn't science fiction at all; it is here, happening in the world, and gaining more and more traction day by day. According to estimates from the International Data Corporation (IDC, a premier global provider of market intelligence, advisory services and events for the information technology, telecommunications and consumer technology markets), the AI market will be worth 47 billion US dollars by the year 2020.
According to John McCarthy, the father of Artificial Intelligence, AI is "the science and engineering of making intelligent machines, especially intelligent computer programs." Forbes (a global media company focusing on business, investing, technology, entrepreneurship, leadership and lifestyle) defines AI as "the broader concept of machines being able to carry out tasks in a way that we would consider 'smart'."
Artificial is something that is not real; it is synthetic and simulated. Intelligence is the ability to acquire knowledge and apply it; it is the sum total of various faculties such as problem-solving, logic, creativity, self-awareness and self-learning. Hence, AI is the simulation and emulation of human intelligence by machines and computer systems. AI makes it possible for machines to learn from experience and historical data using simple and/or complex algorithms and patterns.
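The idea of a machine learning from historical data can be made concrete with a minimal sketch. The example below is illustrative, not from the book: it uses the classic perceptron update rule, one of the simplest learning algorithms, to learn the logical AND function from labelled examples. All names are our own.

```python
# A minimal sketch of "learning from historical data": a perceptron
# that learns the logical AND function from labelled examples.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn two weights and a bias from (inputs, label) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            predicted = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = label - predicted        # zero when the guess is right
            w[0] += lr * error * x1          # nudge weights toward the data
            w[1] += lr * error * x2
            b += lr * error
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# "Historical data": the four labelled cases of logical AND.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # → [0, 0, 0, 1]
```

Nothing about AND is hard-coded; the program extracts the pattern from the examples alone, which is the essence of learning from data rather than being explicitly programmed.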
AI is based on several disciplines like Biology, Mathematics, Engineering, Language, Computer Science, etc., and there are different types of technologies involved in AI research and applications like Deep Learning, Machine Learning, Virtual Agents and more.
1.1.1 History of AI
AI has come a long way from ancient mythology and anecdotes to its modern-day avatar in the form of robots, driverless cars, and so on. In fact, there has been no age, era or civilization without a mention of AI. Many ancient myths and legends speak of artificial entities and mechanical men. Greek myths tell of Hephaestus (a Greek god) who built giant automatons, for example Talos, a warrior programmed to protect the island of Crete. Apart from Talos, Hephaestus developed several other such mechanized systems that could feel and think like humans. An ancient philosophical text called Yoga Vasistha also dealt with the topic of artificial intelligence in the form of war machines and robots. Ancient automata appear in various tales of Medea, Jason, and others. It is interesting to see that the concepts and ideas of AI originated in ancient mythologies.
In 1920, Karel Čapek, a Czech playwright and writer, published a science fiction play named Rossumovi Univerzální Roboti (R.U.R., or Rossum's Universal Robots), and this play introduced the word robot. The play is about artificial people called robots who at first work for humans and then revolt against them, leading to the extinction of the human race. As the American author Pamela McCorduck writes, AI began with "an ancient wish to forge the gods."
The modern history of AI began in the first half of the 20th century, with origins in the work of Alan Turing, Allen Newell and Herbert Simon. Alan Turing, an English computer scientist, philosopher and mathematician, suggested that since humans use information as well as reason to solve problems and make decisions, machines should be able to do the same. In 1950, Turing proposed the Turing Test as a measure of machine intelligence, and it is still used today as a way to assess a machine's ability to think like a human. Earlier, in 1943, the foundation for neural networks was laid by a paper titled "A Logical Calculus of the Ideas Immanent in Nervous Activity", published by Warren McCulloch and Walter Pitts in the Bulletin of Mathematical Biophysics.
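The McCulloch-Pitts model can be sketched in a few lines. The neuron below is illustrative: a binary threshold unit that fires (outputs 1) when the weighted sum of its inputs reaches a threshold. The particular weights and thresholds are our own choices, showing how single units compute logic gates, which is how the 1943 paper framed nervous activity as a "logical calculus".

```python
# A sketch of the 1943 McCulloch-Pitts model: a binary threshold unit.

def mcp_neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of binary inputs reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With suitable weights and thresholds, single units compute logic gates.
def and_gate(a, b):
    return mcp_neuron([a, b], [1, 1], threshold=2)   # fires only when both fire

def or_gate(a, b):
    return mcp_neuron([a, b], [1, 1], threshold=1)   # fires when either fires

def not_gate(a):
    return mcp_neuron([a], [-1], threshold=0)        # inhibitory input

print(and_gate(1, 1), or_gate(0, 1), not_gate(1))  # → 1 1 0
```

Because such units can be wired into networks computing any logical expression, the paper established the link between neurons and computation that modern neural networks still build on.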
In the first half of the 20th century, science fiction popularized the concept of artificially intelligent robots among the general populace. In 1958, Herbert Simon, an American economist and political scientist, declared that within ten years machines would become world chess champions, unless they were barred from international competitions.
Before 1949, computers lacked a key requirement for intelligence: they could not store commands, only execute them. In other words, computers could be told what to do but could not remember what they did. They also lacked the required computational power, and computing was extremely expensive. For these reasons, AI saw limited growth during this period. Still, in the 1940s and 1950s, a handful of scientists, engineers, philosophers and mathematicians contemplated the possibility of creating an artificial brain.
The term Artificial Intelligence was coined in 1956 by John McCarthy, the father of Artificial Intelligence, at the historic Dartmouth Conference, the first ever artificial intelligence conference. From this conference onwards, AI and its progress took off; it triggered the next twenty years of AI research. In 1958, McCarthy developed the Lisp (an acronym for list processing) computer language, which became the standard AI programming language and continues to be used today; it also helped make voice recognition technology possible. This is the period when AI became a genuine science, although the AI algorithms of the time were basic and not very efficient.
By the mid-1960s, progress in the field had slowed down, and AI received bad press for about a decade. Funding for AI research was cut. This period was called the (first) AI Winter. In such a hype cycle, interest in AI begins with a boom in research and funding and ends with a bust of reduced research and funding. Research still continued, but in a new direction: its focus shifted to simulating the psychology of memory and the mechanism of understanding through computers. From 1957 to 1974, several promising developments in machine learning algorithms occurred; for example, Joseph Weizenbaum, a professor at the Massachusetts Institute of Technology, created ELIZA, the first chatterbot program, which could mimic human conversation. A breakthrough in AI, especially in neural network research, came in the form of the backpropagation algorithm, proposed by the scientist Paul Werbos in 1974. These developments led to the introduction of expert systems, which were further developed in the 1980s. The first AI Winter ended with the introduction of expert systems: programs that help find answers to problems in a specific domain.
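The pattern-matching idea behind ELIZA can be sketched in a few lines. The rules below are invented for illustration and are not Weizenbaum's original script: the program matches keyword patterns in the user's sentence and reflects the user's own words back as a question, which is enough to mimic a conversation without any understanding.

```python
import re

# An illustrative ELIZA-style responder: match a keyword pattern,
# reflect the captured words back inside a canned question template.
RULES = [
    (re.compile(r"\bi am (.+)", re.IGNORECASE), "Why do you say you are {0}?"),
    (re.compile(r"\bi feel (.+)", re.IGNORECASE), "How long have you felt {0}?"),
    (re.compile(r"\bbecause\b", re.IGNORECASE), "Is that the real reason?"),
]

def respond(sentence):
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(*match.groups())
    return "Please tell me more."  # default when no rule matches

print(respond("I am worried about machines"))
# → Why do you say you are worried about machines?
```

The trick, then as now, is that reflecting a user's words back creates a convincing illusion of understanding while the program manipulates only surface patterns.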
The second AI Winter came in the late 1980s and early 1990s after a series of financial setbacks. Thereafter, AI began to gain traction again. Technical progress led to the development of machine learning algorithms, and several disciplines were combined to produce hybrid systems used in industrial applications such as speech recognition, fingerprint identification, and so on. David Rumelhart and John Hopfield popularized deep learning techniques, which allowed computers to learn from experience.
During the 1990s and 2000s, many of the major milestones and goals of AI were achieved. In 1997, Garry Kasparov, the reigning world chess champion and grandmaster, was defeated by IBM's Deep Blue (a chess-playing computer program). In the same year, speech recognition software developed by Dragon Systems was implemented on Windows. Over time, as computer storage and processing speed increased exponentially, AI and its capabilities got better and better, and they are now reflected everywhere, from technology to entertainment and from banking to finance.
The last two decades have witnessed a tremendous growth in AI. In 2017, the AI market had reached the 8 billion US dollars mark. In present times, tech giants like Google, Microsoft, etc., are studying, researching and implementing a wide range of artificial intelligence projects.
A Word of Caution
Behind these techno-wonders lies a search for perpetual life and an incessant quest for immortality. There is a strong possibility that the posthuman concept may replace organic consciousness completely with synthetic artificial intelligence.
Figure 1.1 History of Artificial Intelligence
1.1.2 AI and the Fourth Industrial Revolution
AI has been described as the Fourth Industrial Revolution, or Industry 4.0. The earlier industrial revolutions (mechanization, mass production, and digital; the fourth is about the merging of the physical, biological and digital domains) were about automating the mundane tasks of the workforce. AI, by contrast, is about automating intelligent labor. The Fourth Industrial Revolution powered by AI is about automating complex tasks that require tons of data, more than human beings can scour through and analyze. Several industry experts are of the opinion that many data- and information-rich white-collar jobs will be replaced very soon. The trend in the fourth revolution is one of moving away from the automated towards the autonomous. This trend is viewed differently by different nations; non-Western nations' attitudes towards the new technologies, including AI and AR, often differ from those of Western nations. One thing is certain: just like the earlier revolutions, this one will also go through various tumultuous twists and turns.
In 2018, the Future of Humanity Institute (FHI, a multidisciplinary research organization) at the University of Oxford released a study claiming that AI will outperform humans in many activities within the next ten years. AI is no longer limited to capturing and analyzing straightforward data; it has already stepped into developing "tacit" knowledge, and at the bottom of all this is the phenomenal explosion of data. This data is critical for machine learning and AI, which need it for training, learning and perfecting themselves. The more data there is, the better the applications of AI will be. The merger of AI and Big Data in this fourth industrial revolution is the beginning of the next level of intelligence, called Data Intelligence. AI and the fourth industrial revolution are about the speed of embracing this data intelligence economy, and in such an economy the demand for data professionals is bound to increase.
AI is no longer limited to specific tasks or segments; from automobiles to self-service checkouts and from language translation to retail, it is making its presence felt. It is already reshaping both local and global markets through the evolution of machine learning. According to the World Economic Forum (an independent international organization committed to improving the state of the world through public-private cooperation), "The individuals who will succeed in the economy of the future will be those who can complement the work done by mechanical or algorithmic technologies, and 'work with the machines'." In other words, human resources will need to become agile by developing a new skill set that matches this new revolution powered by AI.
Mitre (an American not-for-profit organization that manages federally funded research and development centers supporting several U.S. government agencies) and leading technology companies are fuelling an initiative called Generation AI Nexus. Its main aim is to provide American students with access to AI training, tools and big data so that they can become AI-ready and close employment gaps through workforce reengineering. The initiative is made possible by a partnership among government, companies and academic institutions, and it will be supported by Mitre's analytic framework, called Symphony, which contains a comprehensive set of machine learning and AI tools. The initiative aims to reach 400 universities by the year 2024. India, too, is taking part in the AI-driven fourth industrial revolution. For the past several years, India has been launching various initiatives to boost its Digital India movement and to implement various cutting-edge, emerging technologies, including AI. The Centre for the Fourth Industrial Revolution, recently opened in India by the World Economic Forum, aims at designing new policy protocols and frameworks for these emerging technologies. According to industry figures, this is expected to be a 1-trillion-dollar industry in the next 5-7 years. Digital India is g...
Citation standards for Artificial Intelligence Meets Augmented Reality
APA 6 Citation
Lele, C. (2020). Artificial Intelligence meets Augmented Reality ([edition unavailable]). BPB Publications. Retrieved from https://www.perlego.com/book/2028279/artificial-intelligence-meets-augmented-reality-redefining-regular-reality-pdf (Original work published 2020)