In the wake of the Facebook/Cambridge Analytica user data scandal, Facebook and other large tech firms are now facing additional problems. For Facebook, market capitalization continues to fall rapidly, and Mark Zuckerberg remains aloof. Shares have fallen 17% since the close on Friday, March 16th, with no signs of recovery. Next, Amazon is facing heavy scrutiny from President Trump, who has expressed interest in creating an antitrust or competition law. Amazon's shares in turn have dropped 5%. Finally, Apple is experiencing uncertainty among shareholders following poor sales of the $1,000 iPhone X.
Tech firms have experienced incredible growth this decade, but are now under tremendous pressure following the Facebook scandal. Apple, Alphabet, and Amazon account for 10% of the S&P 500; add Microsoft and Facebook into the mix, and that number rises to 15%. Technology firms overall make up 25% of the S&P 500, and current trends among these firms have resulted in the largest quarterly decline in the stock market since 2015. For the first time in my life, I agree with President Trump's idea of antitrust action against Amazon, as the company may be entering too many markets. What does the future hold for these previously successful tech firms? Do you expect growth to resume, or are we entering a period of decline and extended uncertainty?
As the global population continues to climb to record highs, scientists around the world have been struggling to solve our food crisis. We simply don't seem to have the natural resources to support the number of humans on the planet, at least not at current consumption levels. Until the advent of 3D printing, synthesized foods were never really much of an option. The focus was historically on industrializing agriculture, which has had adverse effects on the ecosystem. Now, the spotlight has shifted toward the potential of "designing" food rather than growing it. Meat and other foods (carrots are mentioned in the article) have already been produced in lab settings, and some companies are beginning to introduce these products into the public marketplace. A California company, Just, has suggested that it will bring its first cultured meat product to market by the end of 2018. A Dutch company, ByFlow, manufactures 3D food printers for just over $4,000. A lot of our processed "junk" foods are already manufactured, but this is the first time we are seeing plants and meats being artificially produced.
So my question to you: if McDonald's or Wendy's started printing their burgers, would you still eat them?
People's continued trust in and use of technology can lead to harmful consequences, especially in the case of prescription pills. Most people have a number of health care providers, and the problem that arises is incompatibility between their systems (e.g., a cardiologist's office and a dermatologist's office). When systems are incompatible, communication errors occur. A 2010 study found that 651 patients in a hospital accrued over 300 errors in their medication lists, many of them potentially harmful.
This article describes a fictional system, MyRxCloud: "a cloud-based, free, ad-free, voluntary, nonprofit mobile app (also available online) that can exchange information with existing electronic health records and does nothing more than keep accurate lists of all patient medications, including prescription and over-the-counter drugs, implants, nutritional supplements, IV solutions and injectables, such as insulin and heparin."
We have talked many times in class about technology system upgrades within a company, but what about a technology system upgrade across an entire industry? Do you think it's possible for a single system to run through all hospitals and clinics to minimize technological incompatibility? What incentives would convince these institutions to change from their current systems?
Vivo, a smartphone brand owned by the same company behind Oppo and OnePlus, announced the X20 Plus UD at CES 2018, the first ever smartphone with an in-screen fingerprint scanner. The X20 Plus UD came with a 6.43-inch full HD+ OLED display with an 18:9 aspect ratio. It uses an OLED panel because the Synaptics tech relies on that type of display for the fingerprint scanner to work: the scanner underneath actually peeks between the pixels to identify the fingerprint.
At MWC 2018, Vivo showed its second in-screen fingerprint concept phone, the Apex, which has a "half-screen" fingerprint scanner and a retracting selfie camera. The lack of bezels also means there's no space for a conventional earpiece speaker, so Vivo's approach is to vibrate the entire screen like a speaker — you can still hear phone calls without holding the device to your head.
On March 19, 2018, Vivo announced the launch of its third in-screen fingerprint scanner smartphone, the X21 Plus UD.
Do you think in-screen fingerprint scanner smartphones are a trend?
Recently, Pizza Hut revealed that it will be teaming up with Toyota on a self-driving vehicle meant to serve consumers in a number of ways. Toyota says it will have the capability to host a mobile store and deliver packages. This concept vehicle, developed by Toyota, is called the e-Palette. The vehicle will be electric and self-driving, with a lower floor and an open interior that will allow consumers to be served and shop around. It will range from 13 to 23 feet in length, just big enough to hold small stores. Toyota said it will be teaming up with a handful of partners such as Amazon and various ride-sharing services, meaning these self-driving vehicles could revolutionize the way consumers shop. Akio Toyoda stated, "Today, you have to travel to the stores; in the future, with e-Palette, the store will come to you."
How do you think the e-Palette will affect consumers' shopping patterns? What other companies besides Amazon and Pizza Hut could you see utilizing this? Do you think this is a trend that will catch on, or do you see it as something that will not last?
Artificial intelligence and automation are drastically affecting the workplace. The rapid rate of change is moving too fast for companies to keep up. One study by the Center for Business and Economic Research claims that half of United States jobs could be replaced by automation. This is due in part to how much data is available in our society: AI can use this data to learn and adapt, and eventually replace jobs.
On the other hand, complete automation has a long way to go. Boyle brings up the trouble Uber is having with its self-driving cars and deadly accidents: one of Uber's vehicles struck and killed a pedestrian in Arizona. Automation definitely has drawbacks, and it is unclear what the future of autonomous cars is.
Is there anything people can do to stop automation? Do you think it will ruin jobs or is there potential to create jobs? Does automation cause more harm than good?
I have never seen anyone wear smart glasses in public, probably because most smart glasses are too conspicuous to wear. Most people do not want to attract everyone's attention with smart glasses that look like they came from a sci-fi movie.
However, Intel's smart glasses, "Vaunt," look a lot more discreet. They look like any other pair of glasses. Vaunt can project messages and notifications onto the user's retina. Users do not need to focus on the messages; they are simply there. When users are not looking at the display, the messages disappear, so they do not disturb the user's vision. There are no buttons or gestures, so it is very user-friendly. I believe it could be a disruptive innovation. I personally want a pair because it is so convenient: imagine you have a shopping list somewhere on your phone or in your pocket — with Vaunt, you could just view it while you shop. On the other hand, the downside is that no one can tell whether the person in front of them is actually listening or reading notifications. In today's society, people already spend too much time on their smartphones even around their closest friends and family, and smart glasses could make the situation worse.
Do you think it is a disruptive innovation? Do you want a pair of Vaunt?
Amazon is exploring real-time translation abilities for Alexa. Currently, the intelligent personal assistant is capable of deciphering only single words and phrases. Many companies have tried to create real-time translating devices, but Amazon believes it can create the best one yet by improving Alexa to work seamlessly in conversations as a real-time translator. Amazon doesn't want it to simply translate language to language; the company wants Alexa to gain a well-rounded knowledge of the cultures in which the translated languages are spoken. By incorporating this knowledge into translation, the device could help users communicate more respectfully and effectively. For example, Spanish is spoken in about 20 countries, but the dialect and accent differ in each one; simple translation services like Google's can't fully capture this component, and this is something Amazon wishes to accomplish.
Amazon expects to make this technology available through any smart device, avoiding the need to carry a dedicated gadget and making the technology more accessible to people all over the world. Other companies have tried to create something similar, but all have failed to develop a seamless device that makes real-time, conversational, and culturally sensitive translations. Even if the technology is still a long way off, Amazon hopes that one day Alexa will be so advanced that a single device could translate multiple people speaking multiple languages at the same time.
What do you guys think of this? Someone in our class previously mentioned the Babel Fish Earbuds, which aim to provide basically the same service — how do you see this type of technology advancing in the future?
Ubiquitous computing is a concept coined by former Xerox PARC chief technologist Mark Weiser, describing a world where computing is made to appear anytime and everywhere. In Bryan Gardiner's article, he talks about JetBlue and how it uses biometric boarding as opposed to a paper ticket or an e-ticket on your smartphone. This boarding process allows passengers to walk up to a gate, let a camera scan their face, and then proceed to the flight if the camera finds a match.
This probably wasn't what Weiser had in mind when he first conceived of ubiquitous computing, but biometric boarding — along with voice assistants and "smart" home products — is an early example of it. The advance of these "hands-off" technologies won't make technologies that require physical input, such as keyboards and smartphones, obsolete. But we are now in the early days of technology starting to "disappear," weaving into our daily lives to the point where we won't even notice it.