The release of Ready Player One this weekend is a significant moment for the virtual reality industry. Ernest Cline’s bestselling science fiction adventure will achieve a Spielberg-fueled launch to new heights of popular culture fame. It describes a world where living in the virtual reality of the OASIS becomes preferable to living within a dystopian future society riddled by war and energy crises. The movie will present a VR-driven future that, like all good science fiction writing, draws on strong familiar themes recognizable by everyone today, right down to the hardware described in the film.
In the story, the OASIS can be accessed virtually free of charge by anyone in possession of both an OASIS visor and a pair of haptic gloves. The visor already looks familiar. Whilst the fictional device is smaller and lighter, with greatly improved performance over today’s headsets, anybody with $800 / £500 to spare can own a similar-looking headset as soon as their favorite online retailer can ship it to them. The differences between today’s technology and the visor described in the book are well characterized, and billions of dollars are already being spent in efforts to close the gap.
The other essential item for accessing the OASIS is a pair of haptic gloves. In Cline’s own words, “When you picked up objects, opened doors, or operated vehicles, the haptic gloves made you feel these nonexistent objects and surfaces as if they were really right there in front of you.” (Ready Player One, p.58). Those who can afford it can then upgrade to higher levels of immersion, with full-body suits, harnesses, omni-treadmills and even smell towers, built to serve the user’s every touch, smell, move and desire.
These products all have various reference points within VR offerings today. The standard haptics offering within consumer VR today is a pair of controllers (HTC’s Vive Controller, Oculus Touch & Sony’s PlayStation Move, etc.) containing various sensors and inertial haptic feedback driven by one of several types of electromagnetic motor. These controllers are a simple commercial starting point, and already there are many types of additional accessories and configurations that add variety to the haptics offerings. For example, Tactical Haptics have demonstrated reconfigurable controllers which integrate shear force surface haptics as accessories on top of existing VR controllers. Both the controller designs and the haptic sensations which they enable are becoming more diverse.
The next step is to go from controllers to wearable systems such as rings or gloves, and ultimately towards apparel. It is possible to integrate similar electromagnetic motors into rings (e.g. GoTouch VR) or into gloves (e.g. Virtuix), but development increasingly tends towards the use of flexible actuators which can better match the properties of the textiles and skin with which they interact. One prominent example of this is the piezoelectric EAP (electroactive polymer) from Novasentis, who are working with many leading VR players on future product opportunities. This can be extended up to full-body apparel, where different types of actuator can be integrated throughout an entire suit (e.g. see TeslaSuit’s solution, which uses direct electrical stimulation of the muscles).
One step further up, the systems become larger, more bespoke and more expensive. Custom haptic controllers and motion simulators have been developed to let the user experience the forces and sensations associated with different virtual scenarios. These range from passive systems (e.g. Icaros’ personal flight simulator) through personal and even multiplayer simulators, not unlike those found in video arcades or VR theme park attractions. Omni-treadmills will become part of this, sensing the direction of motion and literally moving the floor under the user to enable full motion in VR. Here we reach the very high end of the industry, where prices are extremely high and availability is far more limited than anything suggested in Ready Player One.
As the number of users, the sophistication of content, and total revenue in VR grow, many of these higher end haptics options will begin to see the increases in volume and drops in price that we can expect from any growing hardware industry. However, from a development point of view, it is important to focus on the areas in which today’s offerings are fundamentally lacking. To do this, we must break “haptics” into parts including tactile and kinaesthetic feedback, plus other categories such as thermal sensation. The majority of haptic solutions, and nearly all of the consumer-ready options today, are tactile. This includes any kind of vibration or surface textural change where no net force is applied to the user. Where a net force is applied, the feedback is kinaesthetic; this is typically delivered via devices such as body-anchored exoskeletons, or grounded manipulandums or robotic systems. A broad overview of the different categories of tactile and kinaesthetic devices used for haptics in VR today is shown in the image.
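The tactile versus kinaesthetic distinction above reduces to a single question: is there a net force on the user? That rule of thumb can be sketched as a toy classifier in Python (all names and values here are invented for illustration, not any vendor’s API):

```python
from dataclasses import dataclass
from enum import Enum

class HapticCategory(Enum):
    TACTILE = "tactile"            # vibration or texture change: no net force on the user
    KINAESTHETIC = "kinaesthetic"  # exoskeletons, grounded robots: non-zero net force

@dataclass
class FeedbackEvent:
    description: str
    net_force_newtons: float  # net force the actuator applies to the user's body

def classify(event: FeedbackEvent) -> HapticCategory:
    """Apply the article's rule of thumb: any non-zero net force is kinaesthetic."""
    if abs(event.net_force_newtons) > 0.0:
        return HapticCategory.KINAESTHETIC
    return HapticCategory.TACTILE

# Hypothetical examples mirroring the devices discussed above
vibration = FeedbackEvent("controller rumble", 0.0)
exoskeleton = FeedbackEvent("glove exoskeleton resisting grip", 2.5)
```

By this rule, controller rumble and surface-texture effects land in the tactile bucket, while an exoskeleton resisting a grip is kinaesthetic.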
This classification highlights one key difference between the haptic devices in Ready Player One and those in our reality today. Nearly all consumer-ready products, and even those which are pre-commercial but likely to reach consumer markets within the next year or two, involve tactile feedback. Kinaesthetic feedback (i.e. where the net force of the actuator on the body is non-zero) is found in high end and custom systems, but is severely lacking in accessible consumer products. Advanced haptic features such as thermal variation are rare even in high end systems.
This is far from saying that creating the kinds of sensations described in Ready Player One is impossible. In fact, we can look to companies like HaptX (formerly Axon VR), who develop high end haptics solutions involving tactile, kinaesthetic and thermal feedback, achieved by combining several actuator technologies. Today their efforts focus on gloves, but their ultimate goal is to produce products very similar to the full body haptic suits described in Ready Player One. Indeed, Cline’s description in the book of “an elaborate exoskeleton” for kinaesthetic feedback combined with “a web-like network of miniature actuators” against the skin is relatively close to a likely design strategy if such a suit were developed today (perhaps with some additional microfluidics).
However, whilst these products may be achievable, they are far from being shipped as part of consumer VR bundles. For example, HaptX’s gloves will remain extremely high end devices (e.g. for military simulation) for the foreseeable future, costing several orders of magnitude more than would be tolerable for a consumer device. We will rely on elements of technology like this eventually trickling down to consumer markets before we can expect to touch and feel our way through our own virtual Easter egg hunts.
As Ready Player One is released in cinemas around the world, IDTechEx is also releasing its latest market research report after four years of covering the haptics industry. Haptics 2018-2028: Technologies, Markets and Players details the entire haptics industry today, including descriptions and benchmarking of different haptics technologies, data and forecasting on haptics markets, and 35 full company profiles (as part of 66 companies covered). IDTechEx has included interviews and/or other primary research from each of the companies mentioned in this article, detailing aspects of their achievements and technology, and providing critical analysis of their progress via rankings and SWOT analyses. This report is one of nearly 100 market research reports offered by IDTechEx, and sits alongside parallel topics such as Augmented, Mixed and Virtual Reality, Wearable Technology and other User Interfaces reports to provide comprehensive coverage as these industries evolve.
Now IBM’s Watson joins IoT revolution in agriculture
Global expansion of the Watson Decision Platform taps into AI, weather and IoT data to boost production
IBM has announced the global expansion of Watson Decision Platform for Agriculture, with AI technology tailored for new crops and specific regions to help feed a growing population. For the first time, IBM is providing a global agriculture solution that combines predictive technology with data from The Weather Company, an IBM Business, and IoT data to help give farmers around the world greater insights about planning, ploughing, planting, spraying and harvesting.
By 2050, the world will need to feed two billion more people without an increase in arable land [1]. IBM is combining powerful weather data – including historical, current and forecast data and weather prediction models from The Weather Company – with crop models to help improve yield forecast accuracy, generate value, and increase both farm production and profitability.
Roric Paulman, owner/operator of Paulman Farms in Southwest Nebraska, said: “As a farmer, the wild card is always weather. IBM overlays weather details with my own data and historical information to help me apply, verify, and make decisions. For example, our farm is in a highly restricted water basin, so the ability to better anticipate rain not only saves me money but also helps me save precious natural resources.”
New crop models include corn, wheat, soy, cotton, sorghum, barley, sugar cane and potato, with more coming soon. These models will now be available in Africa, the U.S., Canada, Mexico and Brazil, as well as in new markets across Europe and Australia.
Kristen Lauria, general manager of Watson Media and Weather Solutions at IBM, said: “These days farmers don’t just farm food, they also cultivate data – from drones flying over fields to smart irrigation systems, and IoT sensors affixed to combines, seeders, sprayers and other equipment. Most of the time, this data is left on the vine — never analysed or used to derive insights. Watson Decision Platform for Agriculture aims to change that by offering tools and solutions to help growers make more informed decisions about their crops.”
The average farm generates an estimated 500,000 data points per day, a figure expected to grow to 4 million data points by 2036 [2]. Applying AI and analytics to aggregated field, machine and environmental data can help improve shared insights between growers and enterprises across the agriculture ecosystem. With a better view of the fields, growers can see what’s working on certain farms and share best practices with other farmers. The platform assesses data in an electronic field record to identify and communicate crop management patterns and insights. Enterprise businesses such as food companies, grain processors, or produce distributors can then work with farmers to leverage those insights. It helps track crop yield as well as the environmental, weather and plant biologic conditions that contribute to a good or bad yield, supporting irrigation management, pest and disease risk analysis, and cohort analysis for comparing similar subsets of fields.
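The core idea of turning raw field data points into an actionable record can be illustrated with a toy aggregation in Python. The field names, sensor readings and irrigation threshold below are all invented for illustration and are not part of IBM’s platform:

```python
from statistics import mean

# Hypothetical daily soil-moisture readings per field (% volumetric water content)
readings = {
    "field_a": [20.1, 20.8, 19.5, 19.9],
    "field_b": [31.0, 30.4, 29.8, 30.2],
}

DROUGHT_THRESHOLD = 21.0  # illustrative cutoff, not an agronomic standard

def field_summary(readings):
    """Aggregate raw sensor points into a per-field record and flag fields
    whose average moisture falls below the threshold (candidates for irrigation)."""
    summary = {}
    for field, values in readings.items():
        avg = mean(values)
        summary[field] = {
            "avg_moisture": avg,
            "needs_irrigation": avg < DROUGHT_THRESHOLD,
        }
    return summary
```

In this sketch, field_a averages below the cutoff and gets flagged for irrigation while field_b does not; a real platform would fold in weather forecasts, crop models and many more data streams before making such a recommendation.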
The result isn’t just more productive farmers. Watson Decision Platform for Agriculture could help a livestock company eliminate a certain mold or fungus from feed supply grains or help identify the best crop irrigation practices for farmers to use in drought-stricken areas like California. It could help deliver the perfect French fry for a fast food chain that needs longer – not fatter – potatoes from its network of growers. Or it could help a beer distributor produce a more affordable premium beer by growing higher quality barley that meets the standard required to become malting barley.
Watson Decision Platform for Agriculture is built on IBM PAIRS Geoscope from IBM Research, which quickly processes massive, complex geospatial and time-based datasets collected by satellites, drones, aerial flights, millions of IoT sensors and weather models. It crunches large, complex data and creates insights quickly and easily so farmers and food companies can focus on growing crops for global communities.
IBM and The Weather Company help the agriculture industry find value in weather insights. IBM Research collaborates with start-up Hello Tractor to integrate The Weather Company data, remote sensing data (e.g., satellite), and IoT data from tractors. IBM also works with crop nutrition leader Yara to include hyperlocal weather forecasts in its digital platform for real-time recommendations, tailored to specific fields or crops. IBM acquired The Weather Company in 2016 and has since been helping clients better understand and mitigate the cost of weather on their businesses. The global expansion of Watson Decision Platform for Agriculture is the latest innovation in IBM’s efforts to make weather a more predictable business consideration. Also just announced, Weather Signals is a new AI-based tool that merges The Weather Company data with a company’s own operations data to reveal how minor fluctuations in weather affect business.
The combination of rich weather forecast data from The Weather Company and IBM’s AI and Cloud technologies is designed to provide a unique capability, which is being leveraged by agriculture, energy and utility companies, airlines, retailers and many others to make informed business decisions.
[1] The UN Department of Economic and Social Affairs, “World Population Prospects: The 2017 Revision”
[2] Business Insider Intelligence, 2016 report: https://www.businessinsider.com/internet-of-things-smart-agriculture-2016-10
What if Amazon used AI to take on factories?
By ANTONY BOURNE, IFS Global Industry Director for Manufacturing
Amazon recently announced record profits of $3.03bn, breaking its own record for the third consecutive time. However, Amazon appears to be at a crossroads as to where it heads next. Beyond pouring additional energy into Amazon Prime, many have wondered whether the company may decide to enter an entirely new sector, such as manufacturing, to drive future growth. After all, it seems a logical step for a company with its finger in so many pies.
At this point, it is unclear whether Amazon would truly ‘get its hands dirty’ by manufacturing its own products on a grand scale. But what if it did? The scenario is worth exploring: what if Amazon did decide to move into manufacturing, a sector dominated by traditional firms and one yet to see an explosive tech rival enter? After all, many similarly positioned tech giants have stuck to providing data analytics services or consulting to these firms rather than genuinely engaging with manufacturing techniques directly.
If Amazon did factories
If Amazon decided to take a step into manufacturing, it would likely use the Echo range as a template of what AI can achieve. In recent years, Amazon gained expertise designing its Echo home speaker range, which features Alexa, an artificial intelligence and IoT-based digital assistant. Amazon could replicate a similar formula by deploying AI and the Industrial IoT (IIoT) to create an autonomously run smart manufacturing plant. Such a plant could feature IIoT sensors enabling the machinery to be run remotely and to be self-aware, managing external inputs and outputs such as supply deliveries and the shipping of finished goods. Just-in-time logistics would remove the need for warehousing, while other machines could be placed in charge of maintenance using AI and remote access. Through this, Amazon could radically reduce the need for human labour and interaction in manufacturing, as the use of AI, IIoT and data analytics would leave humans only the roles of monitoring and strategic evaluation. Amazon has been using autonomous robots in its logistics and distribution centres since 2017. As demonstrated with the Echo range, this technology is available now, with the full capabilities of blockchain and 5G soon to be realised, allowing an exponentially increased amount of data to be received, processed and communicated.
Manufacturing with knowledge
Theorising what Amazon’s manufacturing debut would look like provides a stark learning opportunity for traditional manufacturers. After all, whenever Amazon has entered the fray in other traditional industries, such as retail and logistics, the sector has never remained the same. The key takeaway for manufacturers is that now is the time to start leveraging the sort of technologies and approaches to data management that Amazon already applies in its current operations. When thinking about how to implement AI and new technologies in existing environments, specific end-business goals and targets must be considered, or else the end result will fail to live up to even the most optimistic expectations. As with any goal, the more targeted your objectives, the more competitive and transformative your results. Once specific targets and deliverables have been set, the resources and methods of implementation must also be considered. As Amazon did with the early automation of its distribution and logistics centres, manufacturers need to implement change gradually, focusing on small, incremental results that generate wider momentum and the appetite for more expansive changes.
In implementing newer technologies, manufacturers need to bear in mind two fundamental aspects of implementation: software and hardware solutions. Enterprise Resource Planning (ERP) software, which is increasingly bolstered by AI, will enable manufacturers to leverage the data from connected IoT devices, sensors, and automated systems from the factory floor and the wider business. ERP software will be the key to making strategic decisions and executing routine operational tasks more efficiently. This will allow manufacturers to keep on top of trends, deliver real-time forecasting, and spot potential problems before they impact the wider business.
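The kind of "spot problems before they impact the business" rule that an AI-bolstered ERP layer might run over machine telemetry can be sketched very simply. The telemetry values, window size and threshold factor below are invented for illustration, not a real ERP feature:

```python
# Hypothetical vibration telemetry from a production-line machine, sampled hourly
telemetry = [0.8, 0.9, 0.85, 1.7, 1.9, 2.1]  # arbitrary units

def flag_anomalies(samples, window=3, factor=1.5):
    """Flag a sample when it exceeds `factor` times the average of the
    preceding `window` samples: a crude stand-in for the predictive
    maintenance checks described above."""
    flags = []
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window
        flags.append(samples[i] > factor * baseline)
    return flags
```

Running this over the sample telemetry flags the jump from a steady ~0.85 to 1.7 and 1.9 as anomalous; once the rolling baseline itself rises, the later 2.1 reading no longer trips the rule, illustrating why real systems combine such heuristics with learned models.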
As for the hardware, stock management drones and sensor-embedded hardware will be the eyes through which manufacturers view the impact emerging technologies bring to their operations. Unlike manual stock audits and counting, drones with AI capabilities can monitor stock intelligently around production so that operations are not disrupted or halted. Manufacturers will be able to see what is working, what is going wrong, and where there is potential for further improvement and change.
Knowledge for manufacturing
Many traditional manufacturers may see Amazon as a looming threat, and smart-factory technologies such as AI and Robotic Process Automation (RPA) as a far-off utopia. However, 2019 presents a perfect opportunity for manufacturers to determine for themselves how the tech giants and emerging technologies will affect the industry. Technologies such as AI and IoT are available today, and their full benefits will only deepen as they are implemented alongside the maturing of other emerging technologies such as 5G and blockchain over the next 3-5 years. Manufacturers need to analyse the needs these technologies can address and produce a proper plan for gradually implementing them against specific targets and deliverables. AI-based software and hardware solutions will fundamentally revolutionise manufacturing, but in 2019 manufacturers just have to be willing to take the first steps towards modernisation.