2018 Tech Trends – A 360 View on Data

As the volume, velocity, variety and value of data grow, so do the possibilities of data. Deeper insights from Big Data that lead to breakthrough discoveries; real-time Fast Data analytics that keep people and information safer; intelligent machines that transform economies; immersive experiences that stretch the boundaries of reality.

As an industry leader in creating environments for data to thrive, Western Digital has a unique vantage point on the possibilities of data in 2018. Read on for a 360 view from our blog authors and experts across the company on the top 2018 tech trends they see.

Janet George, Chief Data Scientist

As companies look to AI for a competitive edge, signal-rich data will become key to the next evolution of analytics. Signal-rich data is authoritative or labelled data: data that is rich with information, that can be captured at the source in as raw a form as possible, and that is unconstrained by specific queries.

As an industry, we initially approached analytics through Business Intelligence (structured data constructed for questions we wanted to answer). The Big Data revolution has allowed us to connect different data sets and schemas and mine the data for insights. Machine learning has brought forth feature extraction for predictive and prescriptive analytics, such as identifying which features could predict the growth of a company (e.g. adding employees, locations, etc.).

“with Fast Data architectures, our approach to data will be changing completely”

Now we are entering the world of deep learning, where neural networks automatically extract features and learn from the training data set. Once training with large labelled data sets is completed, inference at the edge allows for continuous, incremental learning. The challenge with enterprise data, particularly historical data, is that it is not labelled or signal-rich. It is limited and constrained to particular queries or databases, and entire data sets need to be recollected to complete the picture before machine learning, and especially deep learning models, can be leveraged.
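
To make the shift concrete, here is a minimal, hypothetical sketch (Python with scikit-learn, not any particular product or stack): a small neural network is first trained on a labelled, signal-rich data set, then keeps learning incrementally as new observations arrive at the edge.

```python
# Minimal sketch: initial training on labelled data, then incremental
# learning at the edge. Data and model choices here are illustrative only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic labelled data stands in for signal-rich data captured at the source.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_edge, y_train, y_edge = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

# Training phase: the network extracts its own features from the labelled set.
model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
model.fit(X_train, y_train)

# Edge phase: run inference on each incoming batch, then update the model
# incrementally so learning continues after the initial training completes.
for X_batch, y_batch in zip(np.array_split(X_edge, 10), np.array_split(y_edge, 10)):
    _ = model.predict(X_batch)           # near real-time inference
    model.partial_fit(X_batch, y_batch)  # continuous, incremental learning

print("accuracy on edge data:", round(model.score(X_edge, y_edge), 3))
```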


As the industry looks to collect raw data at the edge, in near real-time, with Fast Data architectures, our approach to data will be changing completely. Software-driven development, or “Programming”, will be disrupted and we will be adopting a world driven by learning. General-purpose computing, storage and networking architectures will not be sufficient, and following the emergence of GPGPUs and TPUs, we will see more purpose-built components and architectures to support Fast Data needs.

Vesa Heiskanen, Market Intelligence

No surprise here: most enterprise workloads will end up in the cloud. Meanwhile, the cloud itself will continue changing shape, becoming hybrid on/off-premises and stretching to the edge to connect to consumer and industrial IoT. Edge computing will be one of the big topics of 2018. Storage opportunities expand with the growth of the cloud, and new demand is taking shape at the edge.

Artificial intelligence is being added to consumer and enterprise applications that interact with us with increasing intelligence. Some major cloud and technology companies have already announced their AI-driven voice assistants and chatbots, and those that haven’t are likely to do so in 2018.

“start expecting all devices and services around us to know us and adapt intelligently to who we are and what we expect”

Over the last few years, we have learnt to expect all information to be available through the smartphone at any time. Similarly, over the next few years, we will gradually start expecting all devices and services around us to know us and adapt intelligently to who we are and what we expect.

In the hierarchy of data (zeroes and ones) to information (data combined to make sense) to knowledge (information in relevant context), machines are moving to support us at the knowledge level, having been confined to the two lower levels up to now. It is up to us to make sure we bring this knowledge to the final level, which is wisdom – the capability to use knowledge for beneficial purposes.

Narayan Venkat, Vice President, Data Center Systems

2018 tech trends? I predict three key technology trends for adoption.

The first is analytic applications with more intelligence built in. The adoption of machine learning (ML) in analytic applications will rapidly increase in 2018. ML will help automate basic data preparation and analysis, reduce time to insights, and help data scientists ask more “what if?” questions of their data sets. ML adoption will increase in both core and edge data centers for faster decision making. This will drive changes in the type of infrastructure being deployed for ML. Expect GPU-based computing to increase substantially, with flash media adopted alongside it to scale ML and algorithmic processing.
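
As a rough illustration of what automating basic data preparation can look like in practice, here is a small hypothetical Python sketch: missing values are imputed and features scaled inside a single pipeline, so an analyst can score a “what if?” scenario without manual wrangling. The columns and numbers are invented for the example.

```python
# Illustrative only: a pipeline that automates basic data preparation ahead
# of a model, then answers a hypothetical "what if?" scenario.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Toy data with gaps; the columns are made-up operational metrics.
X = np.array([[5.0, 200.0], [np.nan, 180.0], [7.5, np.nan],
              [9.0, 260.0], [4.0, 150.0], [8.0, 240.0]])
y = np.array([0, 0, 1, 1, 0, 1])  # e.g. whether a service-level target was met

pipeline = Pipeline([
    ("impute", SimpleImputer(strategy="median")),  # automated data preparation
    ("scale", StandardScaler()),
    ("model", LogisticRegression()),
])
pipeline.fit(X, y)

# "What if?" — score a hypothetical scenario in one call.
print(pipeline.predict_proba([[6.5, 220.0]]))
```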

“As data generation and processing gets more distributed and automated, security concerns increase”

The second is that cloud architecture in Edge data centers will also see rapid adoption. Substantial amounts of Fast Data are generated and stored in Edge data centers. Real-time decision-making gains importance as IoT-enabled devices drive a change in the type of infrastructure deployed. Hyper-converged flash storage will find its way into more Edge IoT-type deployments and become a key building block for scaling cloud architecture in Edge data centers.


Lastly, data security becomes paramount. As data generation and processing gets more distributed and automated, security concerns increase. Security threat detection will have to become more proactive. Security investigative platforms will aggressively use machine learning to identify and track various entities (people, programs and devices) that generate and interact with data.
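
As a sketch of what that could look like, the hypothetical Python example below scores entity activity for anomalies with an isolation forest; the features, values and contamination setting are assumptions for illustration, not a description of any particular security platform.

```python
# Illustrative only: flag anomalous entities (people, programs, devices)
# based on how they interact with data.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Per-entity features: [GB read per hour, distinct files touched, failed logins]
normal_activity = rng.normal(loc=[50.0, 20.0, 0.2],
                             scale=[10.0, 5.0, 0.5], size=(500, 3))
suspicious = np.array([[900.0, 400.0, 12.0]])  # bulk reads plus failed logins

detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(normal_activity)

# -1 flags an entity for investigation, 1 means it looks normal.
print(detector.predict(suspicious))          # expected: [-1]
print(detector.predict(normal_activity[:3]))
```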

Christopher Bergey, VP, Embedded Solutions

As I look to 2018 tech trends, I believe we’ll see significant advances in structuring video data (through the use of machine learning and artificial intelligence).

The evolution of video over the last 20 years has dazzled us. We moved from broadcast to time shifting to over-the-top/on-demand. We’ve witnessed the changes from analog to digital, from SD to HD and 4K resolutions, to VR and beyond. Today we can create cinema-quality content from a phone that fits in your pocket. However, most video in the world is still poorly structured data. Yet this is not too different from how much of our data was structured ten years ago, through hierarchical or manually tagged structures.

“It’s true you can “search” for videos or perform other web searches based on some generic descriptors, but you cannot query a video to show you every time a 6’1” male in a yellow shirt appeared in a scene for more than 3 minutes.”

It’s true you can “search” for videos or perform other web searches based on some generic descriptors, but you cannot query a video to show you every time a 6’1” male in a yellow shirt appeared in a scene for more than 3 minutes. Why would you want to do this? Think about a media company that wants to create a new ad. Or what if a store owner could ask their video system how many shoppers they had in the last 30 minutes? How many were female shoppers? How many purchased something? What was the average time in the store? Could it show video clips of the five cars that were parked in the lot for more than 30 minutes? The possibilities are endless, and THIS is structured video.
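
As a rough sketch of the idea, the example below (hypothetical schema and numbers, plain Python and SQLite) treats per-track detections emitted by a vision model as structured records, so a question like “which people in yellow shirts were on screen for more than 3 minutes?” becomes an ordinary query.

```python
# Illustrative only: store detection metadata as structured, queryable records.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE detections (
    track_id INTEGER, label TEXT, shirt_color TEXT,
    first_seen_s REAL, last_seen_s REAL)""")
conn.executemany(
    "INSERT INTO detections VALUES (?, ?, ?, ?, ?)",
    [(1, "person", "yellow", 10.0, 250.0),    # ~4 minutes on screen
     (2, "person", "blue",   30.0, 90.0),
     (3, "person", "yellow", 400.0, 460.0)],  # only 1 minute
)

# "All tracked people in yellow shirts on screen for more than 3 minutes."
rows = conn.execute("""
    SELECT track_id, last_seen_s - first_seen_s AS duration_s
    FROM detections
    WHERE label = 'person' AND shirt_color = 'yellow'
      AND last_seen_s - first_seen_s > 180
""").fetchall()
print(rows)  # [(1, 240.0)]
```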

Taking it one step further, what if we could do it in real time? What if you could ask a camera in your home to inform you when your 16-year-old son gets home between 3:40 p.m. and 4:00 p.m., and not confuse him with your 17-year-old daughter or 12-year-old son? The products that will equip us with this data are becoming available today and will become much more pervasive in our lives and businesses.

Adam Roberts, Fellow

With larger and larger data sets, moving the data to the compute location for analytical work, or even for queries, is starting to prove difficult. Data centers with compute capabilities closer to the actual storage media will become more and more popular. This will drive a number of different architectural adjustments, including more DAS-style hyper-converged solutions, storage enclosures with compute local to the enclosure, and top-of-rack fabric-attached storage. New small form factors will make the problem even more pronounced. Architectural adjustments to use these devices will be driven as much by reducing the failure domain in storage as by improving storage density.

“Data centers with compute capabilities closer to the actual storage media will become more and more popular.”
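
A toy illustration of why this matters, with made-up numbers and no particular product in mind: when a filter runs where the data lives, only the matching records cross the network instead of the whole data set.

```python
# Illustrative only: compare how much data crosses the "network" when the
# filter runs at the compute tier versus next to the storage media.
records = [{"sensor": i % 10, "value": float(i)} for i in range(100_000)]

def move_data_to_compute(storage):
    # Traditional path: ship the entire data set, then filter it centrally.
    shipped = list(storage)
    matches = [r for r in shipped if r["value"] > 99_990]
    return matches, len(shipped)          # everything went over the wire

def move_compute_to_data(storage):
    # Near-data processing: filter in or beside the storage enclosure.
    matches = [r for r in storage if r["value"] > 99_990]
    return matches, len(matches)          # only the results went over the wire

_, shipped_all = move_data_to_compute(records)
_, shipped_few = move_compute_to_data(records)
print(f"records shipped: {shipped_all:,} vs {shipped_few:,}")  # 100,000 vs 9
```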

Furthermore, the cost of compute and storage media, as well as scalability across multiple sources, will be a priority. The capacity of media is growing at such a tremendous pace that the main target will be creating solutions that scale in performance and remain manageable, scaling linearly as hardware is added.

Additional 2018 tech trends: open standards and multi-sourcing will continue to be strongly important. Non-proprietary management software will help deliver solutions at lower cost as the industry looks for more inexpensive ways to tackle growing demands.

Stefaan Vervaet, Sr. Director Solutions Marketing, Data Center Systems

Across many industries the biggest driver in 2018 is digital disruption. New “born in the digital era” companies are changing business models and threatening existing entrenched companies. This disruption is actually accelerating the adoption of new technologies that can level the playing field, such as analytics and Big Data infrastructure. Data and analytics will play a key role in helping existing companies fight back by identifying new revenue opportunities and developing more customer intimacy.

“We will be thinking about data differently. Data will be viewed as a form of currency that has value.”

Will we be doing things differently? You know what they say – “Organizations will change less in 12 months than they predict, but they will change more in 3 years than they imagine.” In 2018 companies will be starting to do many things differently. Of course, we will be dealing with an ever-increasing data explosion. Data will be coming from new and diverse sources and will increasingly be non-text in nature. We will be thinking about data differently.


Data will be viewed as a form of currency that has value. This means we will be thinking about how to get more insight out of that data. As a result, we will be putting data first. We will be defining data architectures instead of infrastructure architectures. We will be creating data strategies instead of storage strategies. We will be starting to experiment with analytics on our data to see how we can increase its latent value.

Walt Hinton, Senior Global Director of Enterprise & Client Compute Solutions Marketing

Two important trends will take over the data center. In the first, hot tech seeks cool tech. In the second, there’s a new MVP in town.

“NVMe™ is the MVP”

It’s getting hot in here. At least that’s what your servers and storage systems are saying about their data center racks. As the density of server and storage systems continues to improve, a penalty comes along – more heat. To combat this, watch for a resurgence in liquid cooling technologies, including everything from immersion-cooled servers to custom liquid-cooled data centers, or even chilled doors for existing data center racks.

The second big trend is that NVMe™ is the MVP. Now with dual-port connections, watch for NVMe to take off for storage array use. The new systems will have lower latencies and much higher IOPS than arrays based on SAS drives. By mid-2018 we will also see a strong market for essential NVMe as it becomes a price/performance alternative to SATA SSDs.

Erik Weaver, Global Director, M&E Strategy & Market Development

As you look at how storage has evolved in Media & Entertainment, there have been several shifts from traditional SAN & NAS architectures toward object storage, cloud and software-defined storage. Right now we are beginning a further shift that abstracts things even further for the workflow, allowing heterogeneous systems to communicate seamlessly with one another. This not only brings a new kind of nimbleness and greater ease of use, it also reduces the brittleness that existed in the workflow. File names and file paths are abstracted, so there is no need for versioning, upgrading, etc. The SMPTE standard 2114 will play a big role in enabling this next evolution.
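
To show the underlying idea of abstracting file names and paths, here is a hedged Python sketch of content addressing: an identifier is derived from the bytes themselves, so heterogeneous systems can refer to the same asset without agreeing on paths or naming conventions. A plain SHA-512 hex digest is used for illustration; the actual SMPTE ST 2114 identifier format is more specific than what is shown here.

```python
# Illustrative only: a content-derived identifier, independent of file name
# or path, computed by hashing the asset's bytes.
import hashlib

def content_id(path: str, chunk_size: int = 1 << 20) -> str:
    """Return a content-derived identifier for a file, independent of its name."""
    digest = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Two copies of the same footage under different names or paths yield the same
# ID, so workflows can exchange references without agreeing on file paths.
```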

