Information Systems Integration – Tony Messina

Jose Alberto Gil

Amazon’s Alexa Could Soon Translate Speech in Real-Time

Amazon is exploring real-time translation abilities for Alexa. Currently, the intelligent personal assistant is only capable of deciphering single words and phrases. Many companies have tried to create real-time translating devices, but Amazon believes it can build the best one yet by improving Alexa to work seamlessly as a real-time translator in conversations. Amazon doesn’t just want Alexa to translate from one language to another; the company wants it to gain a well-rounded knowledge of the cultures in which the translated languages are spoken. By incorporating this knowledge into translation, the device could help users communicate more respectfully and effectively. For example, Spanish is spoken in about 20 countries, but the dialect and accent differ in each one, so simple translation services like Google’s can’t fully capture this component. That is something Amazon hopes to accomplish.

Amazon expects to make this technology available through any smart device, avoiding the need to physically carry a dedicated device and making the technology more accessible to people all over the world. Other companies have tried to create something similar, but all have failed to develop a seamless device that delivers real-time, conversational, and culturally sensitive translations. Even though the technology is still a long way off, Amazon hopes that one day Alexa will be so advanced that a single device could translate multiple people speaking multiple different languages at the same time.

What do you guys think of this? As someone in our class previously mentioned in a post about the Babel Fish Earbuds, which aim to provide basically the same service, how do you see this type of technology advancing in the future?

Amazon Real-Time Translator

A Humanoid Robot Could Come to Your Rescue During a Disaster

The Italian Institute of Technology has been developing a humanoid disaster robot called the WALK-MAN. The main purpose of the robot is to help humans in disaster situations, and after years of development it is one step closer to fulfilling that purpose, as it’s now in the final validation phase. The WALK-MAN isn’t fully autonomous; it is operated by a human wearing a sensor-equipped suit that controls about 80 percent of its actions. The robot is put through a series of tests as it navigates a disaster scenario designed to mimic an industrial plant following an earthquake, moving debris and putting out a hypothetical fire. With technology continuing to evolve, the robot could eventually be helping in future disasters.

From the things we see from Boston Dynamics and their robots, to Sophia the robot with its advanced artificial intelligence, it’s scary to think about how these technologies could evolve when they eventually come together. Similar to the movie I, Robot, where artificially intelligent humanoid robots serve humanity, this could be something in the foreseeable future. What do you guys think? Is having humanoid robots work alongside humanity a good thing or not? How far should we go in giving them a mind of their own?

A Six-Foot Tall Humanoid Robot

How Amazon Runs a Grocery Store with No Lines, Cashiers, or Registers

Amazon Go, which officially opened for business in Seattle in late January, is Amazon’s testing ground for a physical store. The Amazon Go store is similar to any other convenience store we know, Wawa for example, but it has one key difference: there are no checkouts or human cashiers. Instead of waiting in line to pay, customers just download the Amazon Go app, and their Amazon account is automatically billed based on the items they’re carrying when they leave the store. So, people ask, how is this possible? Amazon has created a system that runs heavy surveillance in the store, with hundreds of cameras placed in aisles and on shelves. But it doesn’t use facial recognition; it uses a more sophisticated technology called computer vision. Computer vision allows machines to essentially see what is in front of them and determine what an object is, detecting when an item has been taken from a shelf and which customer took it. And it doesn’t stop there: the system can also remove an item from a customer’s virtual basket if it is put back on the shelf. With this network of cameras tracking people in the store at all times, Amazon is able to bill the right items to the right people when they walk out.
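The take/put-back/bill-on-exit flow described above can be pictured as a simple "virtual basket" per shopper. Here is a toy sketch of that bookkeeping; the class, event names, and prices are all hypothetical illustrations, since Amazon hasn't published how its actual system works:

```python
# Toy model of Amazon Go's "virtual basket" behavior as described above.
# All names and values here are made up for illustration.
from collections import Counter

class VirtualBasket:
    """Tracks the items one tracked shopper is currently carrying."""

    def __init__(self, customer_id):
        self.customer_id = customer_id
        self.items = Counter()  # sku -> quantity carried

    def item_taken(self, sku):
        # Computer vision saw this shopper take an item off a shelf.
        self.items[sku] += 1

    def item_returned(self, sku):
        # The shopper put the item back, so drop it from the basket.
        if self.items[sku] > 0:
            self.items[sku] -= 1

    def checkout(self, prices):
        # On walking out, bill the Amazon account for what's carried.
        return sum(prices[sku] * qty for sku, qty in self.items.items())

basket = VirtualBasket("customer-123")
basket.item_taken("sandwich")
basket.item_taken("soda")
basket.item_returned("soda")  # put back on the shelf, so not billed
total = basket.checkout({"sandwich": 4.99, "soda": 1.50})
print(total)  # 4.99
```

The hard part in the real store, of course, isn't this bookkeeping but the computer vision that reliably generates the "taken" and "returned" events for the right shopper.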

The Amazon Go store was rumored for years, and when it first opened it was only available to Amazon HQ employees. Now that it’s gone public, what future plans does Amazon have for it? And with the acquisition of Whole Foods in the past year, will Amazon implement this system in those stores as well? What could this mean for the future of human workers?

http://www.wired.co.uk/article/amazon-go-seattle-uk-store-how-does-work