Monthly Archives: January 2018
Legacy IT is often characterized by unnecessary complexity and cost: it hinders productivity and innovation and inflates the budget required to maintain it. Modernization becomes the key to success. Modernizing often means moving everything to the cloud, and moving the application portfolio to software-as-a-service offerings is expected to be the next phase.
Cotti-Osmanski works at a company that provides contract resource services to the pharmaceutical, medical device, and biotechnology industries, and she takes a strategic approach to valuing a modernization project. “She considers dollars when thinking about the ROI of replacing legacy. And she weighs compliance, data integration, and security factors as well as whether a modern replacement will better support innovation”. Considering the hardware underneath and whether to push the main data center to the cloud is also crucial. Replacing legacy systems can be very expensive, so the best solution is the one that creates the most value for the company.
So what do you think are the key considerations for modernization? Can legacy IT systems be reused?
Please feel free to comment.
A small team of journalists and software engineers is working on a computer system that can pick out interesting data trends and generate short stories based on those trends. In the article, Peter Clifton, a journalist on the team, states, “We’ve just been emailing [local newspapers] samples of stories we’ve produced and they’ve been using a reasonable number of them.” Clifton expects to distribute 30,000 news articles each month. Robo-journalism is gaining popularity in the US, and machines are showing they can generate news much faster than humans.
The article also mentions that tools are being developed to conduct interviews. Machine learning technology is demonstrating that it can do a lot of what journalists do. However, Joshua Benton of Harvard University’s Nieman Journalism Lab thinks that AI is very far from being able to completely replace journalists. He believes AI may become a powerful tool for journalists to quickly generate news based on popular data trends, but good journalism is a craft that machines are not yet capable of.
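The data-to-story approach described above can be sketched in a few lines: detect a trend in some figures, then slot it into a prewritten template. This is a toy illustration of the general technique, not the team’s actual system; all names and templates here are hypothetical.

```python
# Minimal sketch of template-based "robo-journalism": classify a data
# trend, then fill a story template. Illustrative only.

def detect_trend(label, values):
    """Classify a series of figures as rising, falling, or flat."""
    change = values[-1] - values[0]
    if change > 0:
        direction = "rose"
    elif change < 0:
        direction = "fell"
    else:
        direction = "held steady"
    return {"label": label, "direction": direction,
            "start": values[0], "end": values[-1]}

def write_story(trend):
    """Fill a one-sentence story template from the detected trend."""
    if trend["direction"] == "held steady":
        return f"{trend['label']} held steady at {trend['end']} this quarter."
    return (f"{trend['label']} {trend['direction']} from {trend['start']} "
            f"to {trend['end']} this quarter.")

story = write_story(detect_trend("Local housing permits", [120, 134, 151]))
print(story)  # Local housing permits rose from 120 to 151 this quarter.
```

Real systems add many templates, synonym variation, and editorial checks, but the core pipeline — structured data in, templated sentence out — looks much like this.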
Do you think AI could replace journalists? Would you care if your news was written by a robot?
Feel free to comment.
Blockchain technology is a digital, decentralized, and distributed transaction ledger: a growing list of records that are linked and secured using cryptography. With its recent growth in popularity, corporations are starting to understand the technical capabilities of blockchain technology and to find use cases for it, especially in the financial services industry. Ripple (XRP), founded in 2012, is currently the third-largest cryptocurrency, with a market cap of approximately $50 billion. Ripple has gained a tremendous amount of popularity because it is one of the most scalable cryptocurrencies. XRP connects banks and payment providers through its blockchain network, RippleNet, simplifying global financial transactions. Its advantages are quicker transaction times, lower transaction costs, and the ability to track transactions on its network. By comparison, cross-border Bitcoin payments can take hours and carry high transaction fees, while Ripple transactions can be completed in a matter of seconds for a fraction of the price. Ripple has locked in partnerships with large organizations such as American Express, Santander, and other financial services companies in Europe, Japan, and South Korea.
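The “linked and secured using cryptography” idea can be shown concretely: each block stores the hash of the previous block, so editing any record invalidates every later link. This is a minimal teaching sketch of hash-chaining in general, not how Ripple or any production blockchain is actually implemented.

```python
import hashlib
import json

def block_hash(block):
    """SHA-256 over the block's canonical JSON encoding."""
    data = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(data).hexdigest()

def add_block(chain, records):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "records": records})

def verify(chain):
    """Recompute each link; an edited block breaks every link after it."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
add_block(chain, ["Alice pays Bob 10 XRP"])
add_block(chain, ["Bob pays Carol 4 XRP"])
print(verify(chain))                               # True
chain[0]["records"] = ["Alice pays Bob 1000 XRP"]  # tamper with history
print(verify(chain))                               # False
```

Because each block commits to its predecessor, rewriting history requires recomputing every subsequent hash, which is what makes a distributed ledger tamper-evident.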
In the future, do you think Ripple will gain mass adoption and replace our current financial system? What are some of the risks of a full-scale adoption of Ripple? What other applications could blockchain technology be used for?
Artificial intelligence is evolving rapidly as it becomes more developed. AI is eventually going to change the way information is gathered and how decisions are made. AI can already automate routine tasks, increasing efficiency, and leaders are starting to use it. However, this shift will greatly affect the consulting industry. AI will be able to use data analysis and presentation to advise corporate clients, and although consultants are very good at what they do, artificial intelligence is better.
AI and other machine learning applications can analyze large amounts of structured and unstructured data and produce advice in a short amount of time at a lower cost than consultants in the market. They are also capable of building computer models that capture complex information, because they can detect patterns in data. This process is very difficult for even the largest and most capable consulting firms.
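As a toy illustration of machine-generated “advice” from data (not how any consulting-grade system works), a few lines can fit a linear trend to a metric and phrase the result as a recommendation. The metric names and advice wording are hypothetical.

```python
# Illustrative sketch: fit a least-squares slope to quarterly figures
# and turn the detected pattern into templated advice.

def slope(values):
    """Least-squares slope of values against their index 0..n-1."""
    n = len(values)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(values) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, values))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def advise(metric_name, quarterly_values):
    """Phrase the fitted trend as a one-line recommendation."""
    s = slope(quarterly_values)
    if s > 0:
        return f"{metric_name} is trending up ({s:+.1f}/quarter); investigate drivers."
    return f"{metric_name} is trending down ({s:+.1f}/quarter); investigate causes."

print(advise("Churn rate", [5.0, 5.6, 6.1, 6.9]))
# Churn rate is trending up (+0.6/quarter); investigate drivers.
```

Production systems use far richer models, but the pattern-to-recommendation pipeline is the core of the argument above.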
Many corporations will soon also rely on AI to manage human capital. Many of the decisions made within corporations, for example about mentorship, promotion, and compensation, are political. These biases negatively affect how many groups of people are managed, and as a result organizations often fail to appropriately recognize and reward performance.
What will happen to the consulting industry? Will consultants be completely eliminated due to artificial intelligence?
Virtual reality is continuously battling against augmented reality. The complexity of creating VR games has caused the overall number of game releases to decline, while AR game releases have remained steady. VR is the only medium that ensures users are fully immersed in what they play; it is practically impossible to look away or do other activities. Yet the VR headset’s biggest weakness is also its immersion, because it separates the user from the real world completely. AR games incorporate and enhance the world around you, but often fall short in execution.
Top 8 VR and AR headsets
- 33% — HTC Vive
- 26% — Oculus Rift
- 20% — Sony PlayStation
- 18% — Microsoft HoloLens
- 11% — Samsung Gear VR
- 10% — Magic Leap
- 9% — Google Daydream
- 5% — Google Cardboard
Do you agree with this? Should we branch gaming into VR and AR headsets or will this be another short-term trend?
Comment your opinions on this article.
Boston Consulting Group and MIT Sloan Management Review published joint research, Reshaping Business With Artificial Intelligence. The research found significant gaps between companies that have already adopted and understand artificial intelligence (AI) and those lagging behind. AI early adopters invest heavily in analytics expertise and in ensuring that the quality of their algorithms and data can scale across their enterprise-wide information and knowledge needs. The leading companies that excel at using AI to plan new businesses and streamline existing processes all have solid senior management support for each AI initiative.
- 84% of respondents say AI will enable them to obtain or sustain a competitive advantage.
- 83% believe AI is a strategic priority for their businesses today.
- 75% state that AI will allow them to move into new businesses and ventures.
The research shows that AI will be the catalyst of entirely new business models and change the competitive landscape of entire industries in the next five years.
Do you agree with this? Should AI be adopted into more companies? What are some ways AI can benefit a company?
Starting on May 9th, 2018, companies working in critical services across the UK will have to make themselves compliant with new data protection regulations. These regulations are put in place to force companies to become more serious about protecting themselves and their customers against potential cyber security breaches.
Critical services include organizations working with energy, transport, water, and health, among others. Fines of up to £17 million ($24 million) can be levied on companies that don’t meet the basic security requirements. These requirements include having the proper personnel in place, having the right attack detection techniques, and having the right mitigation techniques in the event of an attack. If an organization has been notified of an improvement that must be made, a fine can be issued as a “last resort” to get it to pay closer attention to protecting its systems. Is it fair to fine companies millions for not having their IT security completely up to par? Perhaps heavy fines could encourage companies to build better cyber security infrastructure.
In another article covering this same topic, the author presents the idea that cloud computing, and the vital information and services it protects and provides, may be as important as power, transport, and fresh water. Is this comparison accurate, and if so, why? It is worth thinking about the importance of protecting the services that we depend on for our basic needs, such as water and healthcare.
Lastly, the US is more focused on developing newer infrastructure than on tightening security regulations. Is this a good idea?
Feel free to comment any thoughts or questions you might have.
With the rise in popularity of cryptocurrencies like Bitcoin, buzzwords like blockchain are being tossed around, oftentimes incorrectly. Blockchain, in my opinion, is an extension of the internet: it is the practice of maintaining a public transaction log, in linked blocks, in a transparent fashion. Bitcoin was the first decentralized digital currency to utilize this idea, and the blockchain is what ensures that when someone sends or pays with a bitcoin, they actually own the coin and are not trying to process duplicate transactions while the network handles the requests (one of the current headaches in the banking industry).
How does this relate to ERP? Maintaining, transacting, and verifying “resources” is the main purpose of current enterprise resource planning. However, there are great inefficiencies in the way different companies process orders between each other. If a system could be built so that all “resources” were stored (in an encrypted manner) on a public ledger, then direct business-to-business and business-to-consumer transactions could occur between all companies utilizing this system as their ERP, with the benefits of almost zero fraud, real-time tracking, and increases in efficiency.
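The duplicate-transaction check mentioned above can be sketched: before appending a transaction to the shared ledger, confirm the sender actually holds the funds and the transaction hasn’t already been recorded. This is a hypothetical toy ledger for illustration; the party names, transaction format, and rules are invented, not any real ERP or blockchain API.

```python
# Toy shared ledger with a double-spend / duplicate check. All names
# and structures here are hypothetical illustrations.

def balance(ledger, party):
    """A party's holdings are just receipts minus payments on the log."""
    received = sum(t["amount"] for t in ledger if t["to"] == party)
    sent = sum(t["amount"] for t in ledger if t["from"] == party)
    return received - sent

def submit(ledger, seen_ids, tx):
    """Append tx only if it is new and the sender owns the funds."""
    if tx["id"] in seen_ids:
        return "rejected: duplicate transaction"
    if balance(ledger, tx["from"]) < tx["amount"]:
        return "rejected: insufficient funds"
    ledger.append(tx)
    seen_ids.add(tx["id"])
    return "accepted"

ledger = [{"id": "t0", "from": "mint", "to": "SupplierCo", "amount": 100}]
seen = {"t0"}
tx = {"id": "t1", "from": "SupplierCo", "to": "BuyerCo", "amount": 60}
print(submit(ledger, seen, tx))   # accepted
print(submit(ledger, seen, tx))   # rejected: duplicate transaction
print(submit(ledger, seen, {"id": "t2", "from": "SupplierCo",
                            "to": "BuyerCo", "amount": 60}))
                                  # rejected: insufficient funds
```

Because every participant can replay the same log and reach the same balances, companies transacting on a shared ledger would not need to reconcile each other’s books after the fact, which is where the efficiency gains come from.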
As MIS students, it’s important for us to understand the most recent trends in technology and information systems. Blockchain is here to stay; if we’d like to remain competitive in the workforce, we need to familiarize ourselves with cutting-edge technologies.
I love all comments and criticism. Thanks for reading!
It’s certainly difficult to contend that AR and VR won’t be prevalent in all aspects of our lives in the future, including our information systems. Researchers at the VTT Technical Research Centre of Finland are studying how the relationship between an everyday consumer and technology can be verified through human senses and emotions, and that’s where VR and AR come into play.
Imagine a construction worker wearing an AR apparatus (likely goggles or glasses) who receives a visual error message if a valve is turned incorrectly or in the wrong order. Auditory and haptic (vibration/touch) feedback can also be used in response to an action. Another proposed use for AR in information systems is the distribution of single-use passwords to users’ AR optics.
What do you think of the role of AR/VR in information systems? The intersection of this technology with the industry seems inevitable. What other uses can you imagine for augmented and virtual reality?
The IT world is filled with a variety of work-structure methodologies. Agile and Waterfall are just two of a number of ways of approaching the software development and IT life cycle. A newer methodology named DevOps has gained a lot of steam in the last five years and has really turned the industry upside down. The key difference between other IT methodologies and DevOps is the removal of barriers between development teams and operations teams, involving everyone to some degree in all parts of the IT life cycle. Some of the most important takeaways from DevOps are:
- A more decentralized approach to project planning to incorporate more ideas and oversight in the development process.
- The idea of continuous deployment: being able to update applications and launch new changes with zero downtime from the end user’s perspective, i.e., no maintenance windows.
- DevOps has evolved with cloud infrastructure explicitly in mind and is focused on helping organizations scale painlessly.
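The zero-downtime idea behind continuous deployment is often realized with a blue-green style switch: the new version runs alongside the old one, and traffic is only repointed once it passes a health check. Below is a minimal sketch of that pattern; the `Router`, environment names, and health check are hypothetical stand-ins, since real pipelines do this through a load balancer or orchestrator.

```python
# Blue-green deployment sketch: two environments exist side by side,
# and a router pointer decides which one serves users. All names here
# are hypothetical illustrations of the pattern.

class Router:
    def __init__(self, live):
        self.live = live      # environment currently receiving traffic

    def switch_to(self, env):
        self.live = env       # atomic repoint = no user-visible downtime

def healthy(env):
    """Stand-in for a real health check (HTTP ping, smoke tests, ...)."""
    return env["status"] == "ok"

def deploy(router, new_env):
    """Cut traffic over only once the new version passes its checks."""
    if not healthy(new_env):
        return f"kept {router.live['name']} (new version failed health check)"
    old = router.live
    router.switch_to(new_env)
    return f"switched {old['name']} -> {new_env['name']}"

router = Router({"name": "blue-v1", "status": "ok"})
print(deploy(router, {"name": "green-v2", "status": "ok"}))
# switched blue-v1 -> green-v2
print(deploy(router, {"name": "blue-v3", "status": "failing"}))
# kept green-v2 (new version failed health check)
```

Users never see a maintenance window because the old environment keeps serving until the instant the pointer flips, and a failed deployment simply never receives traffic.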
DevOps is growing as an enterprise IT methodology and will probably become more and more present in our lives as we get more involved in our workplaces. Do you think decentralization is a good trend? How important do you think it is to remove silos between development and operations teams?
Feel free to comment.