Last year, we put out a theory on edge computing and published a white paper addressing the growth of cloud computing and the need to rethink the level of redundancy required for the physical infrastructure equipment remaining on-premise, or at the edge.

Now we want to challenge the theory and find out if it’s true.

Although we could attempt this challenge in many ways, we kept coming back to the retail industry where Schneider Electric has been doing a lot of work. To us, it’s a fascinating case study because all the retailers seem to be doing the same thing: trying to figure out how to survive against Amazon in a time of record store closings and vast changes in shopping behavior.

IoT, Augmented Reality, & Artificial Intelligence: Pick Your Favorite Buzzword

With so many retail chains closing their doors, it’s life or death for these companies, and they all seem to believe one basic thing: technology may be their way forward. If they apply new technology, they should be able to give their customers a unique, in-store experience that Amazon cannot replicate online. They are fighting back with (pick your favorite buzzword) IoT, Augmented Reality, and Artificial Intelligence. All of these are in play, and retailers are scrambling to figure out how to use them.

And there are stumbling blocks. During a network outage, 72 percent of retailers say they lose sales, and 82 percent say that network downtime results in a negative customer experience. More importantly, their customers are saying it as well: 31 percent of consumers told family and friends about a negative in-store experience. Let’s face this fact: if people have negative in-store experiences, they will talk about them. And 48 percent of consumers will then avoid those stores based on the negative experiences of others.

So, clearly, businesses are struggling to figure out how to survive while facing seismic shifts in consumer buying, and this convinced us to take a closer look at what’s happening here.

This Is Not Star Trek-Type Stuff

This new digital, in-store focus for retail raises lots of questions. First, is retail really deploying new technology? If so, is it having an impact on IT systems? Are we seeing changes to the infrastructure as a result, and is it changing the availability requirements for the local edge?

Here’s an easier question: when’s the last time you went shopping? You may have noticed that the re-imagined brick-and-mortar space is becoming a pretty interesting place. Technology already being rolled out includes a magic mirror that lets you try on clothes and makeup simply by standing in front of it. At McDonald’s, you can place your order at a stand-alone kiosk. At Amazon Go stores, there’s nobody to check you out. Think about the technology involved in walking out of a store without checking out!

I’m talking about technology that’s already in stores. There’s a lot more in the works: mannequins that use Artificial Intelligence to assess your facial expressions and your emotional reaction to an outfit; facial recognition of VIPs, so that when you walk into a store, personnel are alerted, you are greeted by name, and you are shown items you may want to buy based on your purchasing history. Think about the difference in your in-store experience if someone says, “Here are three pairs of shoes we think you might be interested in.”

This is not Star Trek type stuff. This is all in the realm of reality and any interruption in this new technology is likely to have a major business impact.

How Our Resiliency at the Edge Theory is Proving True

We want to know what all these changes will mean for IT in the retail space and what the infrastructure will need to look like. It’s clear that reliance on IT will continue to grow, and we’re moving toward dual network connectivity with cellular wireless as the backup. Maybe that becomes the primary connection in the future, but either way, network redundancy is becoming critical. We might not see generators on site, but we might see two hours of runtime from lithium-ion batteries. The infrastructure doesn’t necessarily have to be the same as what we had in the data center. It’s going to look different, and we must start thinking about how much of an insurance policy a retailer should buy to maintain the in-store experience they’ve worked so hard to deliver.

We’ve helped several major retailers deploy new technology more efficiently in edge data centers connected to the cloud. We learned many things, including that it’s not necessarily the amount of power that makes something mission critical. If the network closet, server room, or micro data center in the back of your store is the connection point between your customer and their experience, it becomes mission critical regardless of its power consumption.

As our white paper on edge computing states, “Use of cloud computing by enterprise companies is growing rapidly. A greater dependence on cloud-based applications means businesses must rethink the level of redundancy of the physical infrastructure equipment (power, cooling, networking) remaining on-premise, at the edge.”

Using retail as a case study, lots of signs point to our theory proving to be true.

Learn More About the Importance of Resiliency at the Edge

Watch this video of the Schneider Electric Meet The Expert series to learn why server rooms and edge closets dominate system availability and why, if not managed properly, these mission critical micro data centers could be your weakest link.

In my last blog, I talked about some of the hype going on in the industry over artificial intelligence (AI) and machine learning (ML). That blog covered what current AI techniques are fundamentally capable of today, and I also offered a definition for these often-misunderstood terms in the context of data centers. In this post, I’d like to develop the discussion by describing three key challenges the industry needs to address and resolve if AI tools are to be broadly adopted and deliver their full value for colocation providers.

Three Things to Overcome Before Broad AI Adoption


The first challenge is instrumenting the data center. The old adage “garbage in, garbage out” applies here more than ever. Despite their “black box” nature, machine learning algorithms and deep neural networks are not magic. Like any analytics engine, they need large volumes of good data to act on. Those with well-implemented DCIM suites are probably in good shape. But part of the challenge of instrumenting the data center also falls on equipment vendors. Does their hardware collect and report the information necessary to make the algorithms work? Schneider Electric has long been digitizing and instrumenting its UPSs, cooling units, PDUs, switchgear, and so on, putting it ahead of others in the industry. But as we develop AI use cases and the algorithms to support them, we may realize we need new sensors in places we don’t have them today. For example, we might find that a vibration sensor in a different location gives us more proactive visibility into the life cycle of a system than we have today. Things like this will evolve over time.

The second challenge is that, traditionally, this data has lived in disparate systems. Facility data lives in the building management system (BMS), power quality information in the electrical power monitoring system (EPMS), information about the white space infrastructure in DCIM tools, and IT software/virtual resources in IT operations management tools. For the system to understand all the critical variables and how they are connected and affect each other, this data should be consolidated and fed into the AI model. Consolidating all this disparate data is a challenge that is still not fully solved, although the new Schneider Electric EcoStruxure™ System Architecture and Platform goes a long way toward solving it. Without consolidation, AI applications are limited to much narrower functions like air handler optimization or early warning of cooling unit fan failures. These are useful functions, of course, but they are not earth-shattering.
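
To make the consolidation step concrete, here is a minimal sketch of aligning telemetry from a BMS, an EPMS, and a DCIM tool onto one timeline that a single model can consume. The signal names, sampling rates, and the pandas-based approach are all illustrative assumptions, not an EcoStruxure interface:

```python
# Hypothetical sketch: each system samples on its own clock, so the data
# must be aligned onto a common grid before a model can reason over it.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def feed(start: str, freq: str, n: int, loc: float, scale: float) -> pd.Series:
    """Stand-in for one system's telemetry export."""
    idx = pd.date_range(start, periods=n, freq=freq)
    return pd.Series(rng.normal(loc, scale, n), index=idx)

bms_temp   = feed("2019-03-01", "1min", 600, 22.0, 0.5)   # BMS supply air, deg C
epms_power = feed("2019-03-01", "30s", 1200, 48.0, 2.0)   # EPMS feed, kW
dcim_load  = feed("2019-03-01", "5min", 120, 6.5, 0.4)    # DCIM rack IT load, kW

# Resample everything to a common 5-minute grid and join into one frame;
# forward-fill bridges small gaps left by the slower feeds.
combined = (
    pd.concat(
        {"supply_air_c": bms_temp, "feed_a_kw": epms_power, "rack_kw": dcim_load},
        axis=1,
    )
    .resample("5min")
    .mean()
    .ffill()
)
print(combined.head())
```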

The third data challenge is what we refer to as data integrity. All of this data needs to be correlated, and there needs to be context: the model needs to know exactly where the data comes from. For a given dataset from a specific asset, the model might need to know things like site, room, row, rack, U-space, power path, network port, and policy requirements. The time periods also need to be synchronized in some way. DCIM tools require having all of this mapped out and defined, but it takes a lot of effort and resources to set that up initially and to then maintain it as things change over time. It is largely up to us vendors to simplify all of this and hide the complexity.
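
As a rough sketch of what that context might look like in practice, the structure below mirrors the attributes listed above. The field names and sample values are hypothetical, not any specific DCIM schema:

```python
# Hypothetical sketch of the contextual metadata a model needs for each
# data point before readings from different systems can be correlated.
from dataclasses import dataclass

@dataclass(frozen=True)
class AssetContext:
    site: str
    room: str
    row: str
    rack: str
    u_space: int          # position in the rack
    power_path: str       # e.g., "UPS-A > PDU-3 > branch 12"
    network_port: str
    policy: str           # availability/redundancy policy tag

@dataclass
class Reading:
    asset_id: str
    context: AssetContext
    timestamp_utc: str    # feeds from different systems must share a clock
    metric: str           # e.g., "inlet_temp_c"
    value: float

# A reading only becomes useful to the model once it carries its context:
sample = Reading(
    asset_id="pdu-3",
    context=AssetContext(
        site="DAL-2", room="Hall B", row="R4", rack="R4-08",
        u_space=38, power_path="UPS-A > PDU-3 > branch 12",
        network_port="sw2/0/14", policy="2N",
    ),
    timestamp_utc="2019-03-01T12:00:00Z",
    metric="branch_current_a",
    value=11.8,
)
print(sample)
```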

What’s Next for Colocation Providers Considering AI-Driven Data Centers

The point here is that these challenges exist and still need to be worked out before AI use in data centers becomes common practice and colocation providers can reap the benefits. Having a well-implemented and well-maintained DCIM system is a key first step for colo providers. Such a system will provide the necessary metering and contextual data to make AI tools effective. Stay tuned for my next blog, where I’ll share how we think AI will be applied in colocation data centers in the near term, as well as why AI’s reliability will have a bigger impact on their business growth.

In my previous blog (Part 1), I introduced the energy paradox: world energy use has increased by 50% over the last 25 years, and yet roughly two billion people on our planet still lack access to reliable electricity. The other part of the equation is that energy demand will increase by almost another 50% by 2050[1].

As a planet, we are consuming more, and we need to reduce our waste, recycle, decrease our carbon intensity, and improve our efficiency by a factor of three to avoid significant and irreversible damage to our planet. When you look at it this way, you see more than a paradox: you see an urgent call to action.

We have already looked at ways to make your industrial operations more sustainable; now I’d like to look more broadly across the business at strategies to save energy and to consume it more intelligently.

Optimized supply chains improve sustainability

Localizing production and thereby reducing transportation is another way to “produce better” and improve the sustainability of your business. The automation of low-value-added tasks, together with the upskilling of plant personnel and operators across all geographies and economies, heralds a new vision for an industrial world in which producing on the other side of the planet to reduce labor costs is no longer as viable as it once was. Driven by neo-protectionist and “hipster” trends of producing and consuming locally, industries that embrace this idea will reduce the footprint of their intercontinental supply chains (and potentially appeal to a broader market as a result).

Sustainable customization

One of the drivers of the digitization of industries is the increasing consumer demand for customized goods. Delivering mass customization, or “lot sizes of one,” becomes much more viable thanks to the increased flexibility of digitized processes and technologies like 3D printing. Add to this a complete set of analytics to optimize planning and scheduling and you gain a high level of flexibility and agility to deliver on a rising consumer demand for goods that are designed and manufactured sustainably with limited waste.

Producing, consuming, storing, and selling your own electricity

Decentralized energy is another option industrial producers can consider in order to stay autonomous when the prices of different energy types (oil, gas, or coal) spike. The idea is to have access to alternative energy sources and to reduce CO2 emissions. Implementing a microgrid lets industrial companies understand their global energy consumption and select the type of energy they need. Playing an active role in demand response makes energy usage more efficient, not just for building use or for feeding power back to the grid, but also for capacity planning and for ensuring a continuous, reliable energy supply.

As I mentioned at the end of the last blog, industry and sustainability are no longer mutually exclusive. Industrial producers now have tools at their disposal to help them make “sustainability decisions” without compromising their businesses. In fact, in many instances, sustainability actions, which are good for our planet, can also bring with them new business opportunities and attract new consumers who share this value. It’s a shift that may take some time and effort at the start, but the payback will come on multiple levels.

To find out more about sustainability at Schneider Electric, click here

To learn more about our industrial automation solutions, click here

[1] McKinsey Energy Outlook, 2016; IEA WEO 2016, Current Policy Scenario (Business as Usual)

Human-machine interface (HMI) software serves as a communication bridge between machine operators and the system to manage and control operations. Some versions of HMI software also translate data from industrial control systems into human-readable visual representations of the systems.

Advanced HMI software gives operators the ability to view system schematics. It can also act as an access control layer, turning switches and pumps on or off. For instance, machine-operated controls can be used to raise or lower the temperature of an air-conditioning system connected through the cloud. HMIs are usually deployed on Windows-based machines that communicate with programmable logic controllers (PLCs) and other industrial controllers.
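
As a rough illustration of that kind of control action (not any particular HMI product’s API), here is a minimal sketch of switching a pump and writing a temperature setpoint to a PLC over Modbus/TCP with the pymodbus library; the host address and the coil/register mappings are assumptions:

```python
# Hypothetical sketch: an HMI-style control action sent to a PLC over
# Modbus/TCP. The address and register mappings below are made up.
from pymodbus.client import ModbusTcpClient  # pymodbus 3.x import path

PLC_HOST = "192.168.1.50"    # assumed PLC address
PUMP_COIL = 0                # coil assumed to drive the pump contactor
SETPOINT_REGISTER = 10       # holding register assumed to hold setpoint x10

client = ModbusTcpClient(PLC_HOST)
if client.connect():
    client.write_coil(PUMP_COIL, True)             # switch the pump on
    client.write_register(SETPOINT_REGISTER, 215)  # set 21.5 degrees C
    client.close()
```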

Machine learning is a subset of artificial intelligence built on an iterative learning process. It is vital because, as the system is exposed to new data for the same query, it can adapt its behavior. The system learns from previous computations to generate reliable, repeatable decisions and results, which makes the computation and operation of connected devices more robust.

As the system grasps the expected operations, it becomes able to make predictions based on massive amounts of data. This capability is driven by the branch of artificial intelligence that deals in pattern recognition: the system can independently draw knowledge from experience. For this reason, the technology has gained real importance in industrial processes.

When HMI software is integrated with this technology, it supports efficient, error-free functioning under normal conditions. The setup can be used not only to predict required behavior but also to forecast system defects. This means breakdowns can be anticipated and preventive measures taken before any major failure of the machinery occurs, removing the need to plug a separate predictive maintenance mechanism into the system.
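
As a minimal sketch of how such defect forecasting can work, the example below trains an anomaly detector on healthy telemetry and flags readings that drift away from it. The sensor names, values, and the choice of scikit-learn’s IsolationForest are illustrative assumptions, not a description of any specific HMI product:

```python
# Hypothetical sketch: learn "normal" from historical HMI telemetry, then
# flag new readings that look abnormal before they become failures.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Stand-in for historical telemetry: pump vibration (mm/s) and motor
# temperature (deg C) under healthy operating conditions.
healthy = np.column_stack([
    rng.normal(2.0, 0.3, 5000),
    rng.normal(65.0, 3.0, 5000),
])

# Train on normal behavior only; the model learns what "usual" looks like.
model = IsolationForest(contamination=0.01, random_state=0)
model.fit(healthy)

# Score new readings: the first looks healthy, the second is drifting.
new_readings = np.array([[2.1, 66.0], [4.8, 88.0]])
for reading, label in zip(new_readings, model.predict(new_readings)):
    status = "OK" if label == 1 else "anomaly: schedule maintenance"
    print(reading, status)
```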

One of the many purposes this serves is to increase throughput and reduce manual effort, since machine operations become accurate and timely. Self-operating machine actions also free up human effort that can be redirected to other tasks.