Is AI the ICT professional’s friend or foe?

July 17, 2023
Artificial intelligence isn’t likely to replace the jobs of cabling-system designers and installers. But it could help map out optimum cable routes, and undoubtedly will mean more bits and bytes flying through data centers.

Artificial intelligence (AI), with its subset machine learning (ML), is one of the fastest-growing technologies, and much of the attention it receives centers on doomsday scenarios surrounding deepfakes, deception, bias, and other ethical concerns. In May, the White House Office of Science and Technology Policy (OSTP) released a request for information on national priorities for AI, citing that although the technology can improve lives and solve tough global challenges, it poses “serious risks to democracy, the economy, national security, civil rights, and society at large.” Despite the risks, the technology is here and skyrocketing in use among tech giants and across every market—from finance, healthcare, and manufacturing to transportation, retail, and entertainment. According to IBM, 35% of global businesses have embraced AI, with 63% of IT and telecom organizations already using the technology.

Glenn Adair, A&E program manager for security and surveillance technology provider i-PRO Americas (formerly an entity of Panasonic), says it’s essential that ICT professionals have a good baseline understanding of AI and its impact on the design and deployment of data centers and LANs—even if they aren’t using the technology themselves. To that end, Adair will be presenting “Demystifying Artificial Intelligence and Machine Learning” on Tuesday, September 12, at the BICSI Fall Conference & Exhibition in Las Vegas. In fact, you can expect to hear plenty of references to AI at the conference—from the opening keynote speaker to experts from Corning covering data center design for AI applications.

All about algorithms

Most sources refer to AI and ML technology as systems capable of simulating human intelligence and thought processes. While that definition feeds the notion of robots taking over the world, the technology has been used for decades—long before IBM’s supercomputer Deep Blue defeated world chess champion Garry Kasparov in 1997. According to Adair, the first AI algorithms were more deterministic, similar to decision trees used in flow charts, and used mainly for categorizing data. “Early algorithms were a lot like playing 20 questions, and if we didn’t get the right answer, we adjusted the behavior to ask the right four or five questions until it found the right answer,” he says.
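To make that distinction concrete, a deterministic classifier of that era can be sketched as nothing more than hand-written rules—no learning involved. The device categories and thresholds below are hypothetical, chosen purely to illustrate the “20 questions” style:

```python
# A minimal sketch of a deterministic, rule-based classifier, in the
# spirit of early "20 questions" style AI. Every rule is hand-written;
# nothing is learned from data. Categories and thresholds are made up.

def classify_device(power_watts: float, has_fans: bool, rack_mounted: bool) -> str:
    """Categorize a piece of equipment by walking a fixed decision tree."""
    if not rack_mounted:
        return "desktop equipment"
    if power_watts > 500:
        return "server" if has_fans else "power distribution unit"
    return "network switch" if has_fans else "patch panel"

print(classify_device(power_watts=750, has_fans=True, rack_mounted=True))  # server
print(classify_device(power_watts=120, has_fans=True, rack_mounted=True))  # network switch
```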

Adair says one way to look at ML, the subset of AI, is as algorithms written backward. For example, rather than telling an algorithm to multiply A by B to get C, ML involves asking a machine to achieve C based on A and B without telling it how to do it. “In ML, the machine tries various possibilities until it comes up with the right answer, and the more data, the more it figures it out,” explains Adair. “Because we don’t understand how the machine came up with that answer, it’s opaque, and we never know if we’re always going to get the right answer. That’s where the technology has its limitation.”
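A toy version of that inversion, assuming only NumPy: the rule C = A × B is never hard-coded as the answer. The machine is handed example values of A, B, and C plus a set of candidate terms, and gradient descent searches for the weights that reproduce the relationship:

```python
# A toy version of "algorithms written backward": we never tell the
# machine that C = A * B. We give it examples and candidate terms and
# let gradient descent discover which combination works. Assumes NumPy.
import numpy as np

rng = np.random.default_rng(0)
a = rng.uniform(0, 10, 1000)
b = rng.uniform(0, 10, 1000)
c = a * b                       # the hidden rule the machine must discover

# Candidate model: c_hat = w1*a + w2*b + w3*(a*b). The machine must
# learn that only the interaction term matters.
w = np.zeros(3)
lr = 2e-4
for _ in range(20_000):
    c_hat = w[0] * a + w[1] * b + w[2] * (a * b)
    err = c_hat - c
    grad = np.array([(err * a).mean(), (err * b).mean(), (err * (a * b)).mean()])
    w -= lr * grad

print(np.round(w, 3))  # approaches [0, 0, 1]: it "figured out" multiplication
```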

AI algorithms are typically categorized as supervised and unsupervised learning. Supervised learning relies on clearly labeled data for either classification or regression analysis. Regression analysis is essentially a mathematical determination of the relationship between two or more variables, one of which is typically a dependent variable. Examples include estimating a value (e.g., ROI), predicting probability (e.g., medical diagnosis, market growth, etc.), or identifying a boundary (e.g., an acceptable amount of error). One type of regression analysis that most of us are familiar with—even if we don’t realize it—is Bayesian modeling, which looks for similarities to determine probabilities. “It’s how Netflix suggestion models work,” says Adair.
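As a concrete taste of supervised regression, here is a minimal sketch assuming scikit-learn is available; the cable-drop figures are fabricated purely for illustration:

```python
# A minimal supervised-learning sketch: regression on labeled data.
# Assumes scikit-learn; the data (cable drops vs. labor hours) is
# fabricated purely for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

# Labeled training data: number of cable drops -> labor hours.
X = np.array([[10], [25], [40], [60], [80]])   # feature: drops per job
y = np.array([8.0, 18.5, 29.0, 44.0, 58.5])    # label: hours actually spent

model = LinearRegression().fit(X, y)
print(model.predict([[50]]))  # estimated hours for a 50-drop job
```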

Unsupervised learning finds patterns and relationships in unlabeled or raw data. Bayesian modeling that looks at similarities can also be unsupervised, such as clustering algorithms that analyze and organize raw data points into meaningful groups. Spam filtering and targeting in marketing are two examples of clustering. Ironically, clustering is even used to examine content and identify deepfakes.
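A minimal clustering sketch, again assuming scikit-learn, with synthetic two-feature data standing in for message statistics:

```python
# A minimal unsupervised-learning sketch: grouping unlabeled points,
# as in spam filtering or market segmentation. Assumes scikit-learn;
# the two features (messages per day, average message length) are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
group_a = rng.normal(loc=[5, 120], scale=2.0, size=(50, 2))
group_b = rng.normal(loc=[200, 15], scale=5.0, size=(50, 2))
X = np.vstack([group_a, group_b])          # no labels provided

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_[:5], kmeans.labels_[-5:])  # two discovered groups
```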

Neural network algorithms are the most complex AI type, typically combining unsupervised and supervised learning to closely mimic the human brain—think self-driving cars, Siri, facial recognition, disease diagnosis, and generative AI like ChatGPT. These algorithms are considered deep learning and rely on extensive data sets. “Neural networks are really quite sophisticated, and we’re very interested in using the technology in our security solutions for perception capabilities, such as behavior recognition and object recognition,” says Adair. “We can leverage this technology to detect anything and everything, but we still tell it what we’re looking for.”
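For a hedged taste of what a neural network adds, the sketch below (assuming scikit-learn) trains a tiny multi-layer perceptron on XOR, a pattern no single linear rule can separate; a production perception system would use far deeper networks and far more data:

```python
# A minimal neural-network sketch: a small multi-layer perceptron
# learning XOR, which is not linearly separable. Assumes scikit-learn.
import numpy as np
from sklearn.neural_network import MLPClassifier

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])  # XOR

net = MLPClassifier(hidden_layer_sizes=(8,), activation="tanh",
                    solver="lbfgs", max_iter=5000, random_state=0)
net.fit(X, y)
print(net.predict(X))  # expected: [0 1 1 0]
```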

According to Adair, there must be a level of uncertainty for neural network algorithms to work. “I know it sounds counterintuitive, but if the algorithms always achieve 100%, they don’t know what to do when there’s an exception,” he says. “We want to allow a little error. After all, humans make mistakes too. The goal is to drive the error down to less than human error, but not so far that it can cause a catastrophic failure when it encounters something it’s never seen before.”
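One common way to build in that tolerance is to let the model decline to answer: reject predictions below a confidence threshold and defer to a human rather than force a label. A sketch, assuming scikit-learn and an illustrative threshold:

```python
# A sketch of one way to handle "something it's never seen before":
# instead of forcing an answer, reject low-confidence predictions and
# defer to a human. The threshold and data are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
clf = LogisticRegression().fit(X, y)

def predict_or_defer(sample, threshold=0.90):
    probs = clf.predict_proba([sample])[0]
    if probs.max() < threshold:
        return "defer to human"        # the model admits uncertainty
    return int(probs.argmax())

print(predict_or_defer([2.1, 1.8]))    # confident -> class label
print(predict_or_defer([0.0, 0.0]))    # ambiguous -> defer to human
```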

Potential industry use cases

AI offers tremendous potential for ICT and the building construction industry. It is already making cybersecurity systems more resilient by flagging anomalies within system usage patterns that point to potentially malicious activities and data breaches. Microsoft’s Cyber Signals program uses AI to detect malicious activity and software-related weaknesses, and the company reports that it has successfully blocked over 35.7 billion phishing attacks and 25.6 billion identity theft attempts. AI also has the potential to automate many common IT processes, such as compliance and auditing, service requests, and asset management functions like equipment configurations and software upgrades.
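The anomaly-detection idea itself can be sketched in a few lines. This is a generic illustration assuming scikit-learn, not Microsoft’s implementation, and the usage features are fabricated:

```python
# A sketch of anomaly detection over usage patterns, in the spirit of
# AI-assisted security monitoring. Assumes scikit-learn; the features
# (logins per hour, bytes transferred) are fabricated.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(3)
normal_activity = rng.normal(loc=[5, 200], scale=[1, 30], size=(500, 2))
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_activity)

suspicious = np.array([[40, 5000]])    # a burst far outside the usual pattern
print(detector.predict(suspicious))    # -1 flags an anomaly; 1 means normal
```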

In the smart building industry, providers like Siemens, Honeywell, and Johnson Controls are integrating AI algorithms into their building management system (BMS) platforms to make decisions that reduce cost and achieve energy, environmental, and occupant experience goals. For example, AI can help prevent system failures by analyzing patterns to detect and diagnose anomalies within HVAC or other equipment, or it can optimize space utilization by analyzing historical occupancy data.
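A hedged sketch of that kind of pattern-based fault detection, using nothing but NumPy and synthetic supply-air temperatures: flag any reading that drifts well outside the equipment’s historical norm.

```python
# A sketch of pattern-based HVAC fault detection: flag sensor readings
# far outside the historical norm. Pure NumPy; the data is synthetic.
import numpy as np

rng = np.random.default_rng(4)
history = rng.normal(loc=22.0, scale=0.5, size=1000)  # supply-air temps, degC
mean, std = history.mean(), history.std()

def check_reading(temp_c: float, z_limit: float = 3.0) -> str:
    """Flag readings more than z_limit standard deviations from normal."""
    z = abs(temp_c - mean) / std
    return "anomaly - inspect unit" if z > z_limit else "normal"

print(check_reading(22.3))   # normal
print(check_reading(27.5))   # anomaly - inspect unit
```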

AI can also help designers plan more effectively. Britain’s new HS2 high-speed rail line leveraged AI to optimize signage placement by analyzing travelers’ emotions to determine when and where they were confused. “I believe we’re going to see AI increasingly used in predictive models for inspection and planning,” says Adair. “It could even be used for network infrastructure design to discover the most efficient cable route with the least amount of material.”
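At its simplest, the cable-routing idea reduces to shortest-path search over a weighted graph of pathway segments. A sketch assuming the networkx library, with hypothetical node names and segment lengths in meters:

```python
# A sketch of the route-optimization idea: model pathway segments as a
# weighted graph and find the route needing the least cable. Assumes
# networkx; node names and segment lengths (meters) are hypothetical.
import networkx as nx

pathways = nx.Graph()
pathways.add_weighted_edges_from([
    ("MDF", "riser_1", 12), ("MDF", "riser_2", 22),
    ("riser_1", "IDF_3F", 35), ("riser_2", "IDF_3F", 15),
    ("riser_1", "riser_2", 8),
])

route = nx.shortest_path(pathways, "MDF", "IDF_3F", weight="weight")
length = nx.shortest_path_length(pathways, "MDF", "IDF_3F", weight="weight")
print(route, f"{length} m")  # ['MDF', 'riser_1', 'riser_2', 'IDF_3F'] 35 m
```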

Ashish Moondra, Chatsworth Products’ director of strategic alliances and electronics product management, anticipates AI will even make its way into data center infrastructure management platforms. “Data center managers still want control, but as they become more comfortable with AI, we might see it used for more advanced predictive power modeling and data center design based on data gathered from intelligent PDUs, UPS, environmental sensors, and other devices,” he says.

More data and compute bring more opportunity

As a cabling infrastructure professional, you might be wondering how AI and the data required will impact the design and deployment of the data centers and LANs you deal with on a daily basis. Adair says ICT professionals should be more excited than concerned. “It’s going to mean huge opportunities because it’s all about the quality and quantity of data, and that will drive significant data center development,” he says.

As more building systems leverage AI, they’ll need more data, requiring more connected endpoints and sensors in more locations. That can drive increased deployment of wireless technologies and single-pair Ethernet in the LAN. With so much data to analyze, data center equipment running AI algorithms also requires nearly three times more power than traditional computing functions, demanding as much as 30kW per rack or more, with high-performance computing at the hyperscale and cloud level seeing rack densities of 50kW or more. These higher rack power densities will significantly impact data center design and drive the need for more advanced thermal management and cooling technologies such as liquid cooling.
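To put those densities in planning terms, here is a back-of-the-envelope heat-load calculation; the rack counts are hypothetical, and the roughly 3,412 BTU/hr produced per kW of IT load is a standard conversion:

```python
# A back-of-the-envelope heat-load sketch for the rack densities cited
# above. Rack counts are hypothetical; each kW of IT load produces
# about 3,412 BTU/hr of heat the cooling plant must remove.
BTU_HR_PER_KW = 3412

racks = {"traditional (10 kW/rack)": (20, 10), "AI (30 kW/rack)": (20, 30)}
for label, (count, kw_per_rack) in racks.items():
    total_kw = count * kw_per_rack
    print(f"{label}: {total_kw} kW IT load ~ {total_kw * BTU_HR_PER_KW:,} BTU/hr to cool")
# The same 20-rack row triples its heat load at AI densities, which is
# why advanced cooling such as liquid cooling enters the conversation.
```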

Despite the opportunities, Adair points out that AI is still driven by human input. “The ‘A’ in AI should be apparent. Machines have no desires or wants, and it’s up to us to drive what we want them to do based on the data we feed them. It’s not very useful if the algorithms don’t know what to look for,” he says. “In the meantime, we need a body of standards to ensure that algorithms aren’t running unsupervised and creating security vulnerabilities that aren’t transparent to humans.”
