Word Problems: A Delicious AI Use Case
I have this recurring nightmare, and it’s all because of word problems. In the dream, I’m back in 7th grade. I’m taking a math test. The test consists of one question and it’s a word problem.
“A train leaves the Kansas City train station at 08:35 AM CT and travels east at 85.07 feet per second towards Boston, while a second train leaves the Boston train station at precisely the same time and travels west at 93.34 kilometers per hour towards Kansas City. If the distance from Kansas City to Boston by train is 1,432 miles, how many pancakes did the engineer of the Boston train have for breakfast? SHOW YOUR WORK!”
I read the word problem again. Then I read it again, and again. 5,280 feet in a mile. 60 seconds in a minute. 60 minutes in an hour. The words and numbers swirl on the page as I push down harder and harder on the tip of my yellow #2 pencil.
3,281 feet in a kilometer... 1.61 kilometers in a mile...
The class bell rings just as my pencil snaps in half.
I wake up. I’m screaming and sweaty. I’m craving pancakes. It’s awful — especially when I’m the overnight guest of friends and family.
Stupid word problems.
They’re a trigger for lots of people, so I’m able to take some small amount of comfort in knowing that I’m not alone with my anxiety about these mechanisms of mathematical misery. My friends and family have suggested that I join the local support group for those struggling with PTWPD (Post-Traumatic Word Problem Disorder). I’m considering it, but in the meantime, something that gives me hope is that AI (artificial intelligence) is really good at sorting out the stuff that is Grade A, 100% pure, word problem nightmare fuel.
In this installment of ACP’s ongoing series about AI, we’re going to explore the Internet of Things, a deluge of data, and great recipes. We’re also going to tell you about a few of our favorite AI-powered solutions to the world’s word problem nightmares.
The Internet of Things (IoT)
Let’s start with the Internet of Things (IoT). The IoT is a network of physical objects that are embedded with sensors, software, and other technologies for the purpose of connecting and exchanging data with other devices and systems over the internet. IoT devices range from personal technology and household objects to road signs, traffic lights, and sophisticated industrial tools.
According to Statista, there were approximately 15.1 billion connected IoT devices as of 2023, and that figure is projected to double within the next 7 years.
With the rapid development, adoption, and proliferation of IoT devices comes the generation of data on a truly remarkable scale. Search engine data, sensor data, social media data, consumer data, financial market data, and scientific data. Stock prices, temperature readings, keywords, population densities, likes, dislikes, adds, mentions, machine performance metrics, research findings, your abandoned online shopping cart at Archie McPhee, and all things in between.
It’s raining data… a whole lot of data. In fact, it’s so much data that in perhaps the greatest cyber-flex of all time, data scientists haven’t even tried to come up with some crazy, hyperbolic adjective to describe it. They just call it...
“Big Data.”
What is Big Data?
Much of what comprises Big Data is information that’s been collected across both space and time. It’s called spatiotemporal data, and just like word problems, it can be a real nightmare to work with.
For starters, spatiotemporal data has two dimensions (space and time), and it can be very large and complex. It’s noisy, incomplete, and usually collected at high speed. To top it all off, the data is frequently affected by environmental, political, socioeconomic, and technological factors, which further complicate matters. In summary, Big Data is a Big Challenge to collect, store, and analyze using traditional data processing methods.
Now ordinarily, this is the point at which I’d be getting sweaty and mumbling about maple syrup — but not today, my friends. Not today! And the reason why is actually about pancakes.
Well, not really, but for argument’s sake, if we were going to make some delicious pancakes right now, how would we do it? First things first, we’d start with a good recipe. Our recipe would include specific instructions for a series of repeatable steps. By following those recipe steps, and combining a collection of ingredients, we would generate an output… in this case a beautiful stack of pancakes. Mmmm... delicious.
Stick with me and my pancake metaphor just a moment longer, because Big Data analysis is a lot like making pancakes. You need a good recipe, which I’m told smart people call an algorithm, and you need your ingredients (the data). By following the recipe, errrr… algorithm and mixing in your data, the output is delicious insight from all that data.
But what if your kitchen pantry contained millions and millions of ingredients? What if you were familiar with some of them, but many more you’d never seen or heard of? What if that pantry inventory was constantly growing and changing? How long would it take you to taste and inventory each of the ingredients? And what if you could take a great pancake recipe and test it using those millions of different ingredients? What if you adjusted the proportions in the process? What if your capacity to create new combinations weren’t limited by pesky things like two arms, one mouth, and a few hours of sleep? Would you add another stand mixer and skillet? How about 1,000 more stand mixers and skillets? And moreover, what if you could systematically adjust, refine, and improve upon your recipe with each iteration? What if you tried other recipes, then incorporated, adjusted, refined, and improved upon those? Could you create a better pancake? Could you discover something even better than pancakes? How many better things? See where I’m going with this?
AI-powered recipes, errrr…. algorithms can analyze large and complex datasets that would be impossible or impractical for humans to analyze. Even more incredibly, AI doesn’t need every possible pattern and outcome permutation to be explicitly pre-programmed. Instead, it uses historical data as input to predict new output values, improving the algorithm’s performance over time as it processes more information. This AI magic is called machine learning.
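To keep the pancake metaphor honest, here’s a toy sketch of what “learning from historical data and improving with each iteration” can look like. It’s plain Python with made-up numbers (a single-parameter model trained by gradient descent), not any particular product’s algorithm:

```python
# Historical data: cups of batter used -> pancakes produced (invented pairs)
data = [(1.0, 4.0), (2.0, 8.0), (3.0, 12.0), (4.0, 16.0)]

w = 0.0       # the model's single parameter, refined each pass
rate = 0.01   # how big an adjustment each iteration makes

for _ in range(1000):             # repeat the "recipe", improving it each time
    for x, y in data:
        error = (w * x) - y       # how wrong the current guess is
        w -= rate * error * x     # nudge w to shrink that error

# Predict the output for an input the model never saw (5.0 cups of batter)
print(round(w * 5.0))             # converges to 20
```

Each pass through the data leaves the recipe a little less wrong than before, which is the whole trick: nobody pre-programmed the answer “4 pancakes per cup”; the model converged on it from examples.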
Machine Learning Examples
Here are a few of our favorite examples of how AI machine learning is being used today:
- AI-powered analysis is used extensively by retailers to forecast customer demand for products and optimize their inventory levels. It’s also used to recommend products to customers based on their past purchase history, browsing behavior, and other factors. This can help retailers operate more efficiently and give their customers a better overall shopping experience.
- In civic planning, AI can be used to manage transportation, such as identifying congestion hotspots and optimizing traffic flow. This can help to improve safety and traffic efficiency, while reducing pollution.
- In local, state, and federal government, machine learning algorithms can be used to help understand, anticipate, and respond to natural disasters. By identifying the areas that are at risk and the factors contributing to those risks, AI helps public safety and emergency response teams save lives in times of crisis.
- In the financial services industry, AI-powered data analysis is used to monitor transactions and account activity, and can help identify patterns that are associated with potential fraud. In fact, if you’ve ever been saved by a credit card “fraud alert” like I have, you can probably thank AI.
- In public health, machine learning algorithms are used to track the spread of diseases and identify areas that are at risk. This can help to improve prevention and control. In fact, AI data analysis has been at the forefront of the global healthcare community’s ongoing response to the COVID-19 pandemic.
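Take the fraud-alert example above. Real banking systems use far more sophisticated machine learning models, but the core idea, flagging activity that deviates sharply from a cardholder’s history, can be sketched in a few lines of plain Python with invented numbers:

```python
import statistics

# A cardholder's recent charge history in dollars (made-up values)
history = [12.50, 8.99, 15.00, 9.75, 11.20, 14.40]

mean = statistics.mean(history)
stdev = statistics.stdev(history)

def looks_suspicious(amount, threshold=3.0):
    """Flag a charge more than `threshold` standard deviations from the mean."""
    return abs(amount - mean) / stdev > threshold

print(looks_suspicious(13.00))    # a typical charge -> False
print(looks_suspicious(950.00))   # wildly atypical  -> True
```

A production system would weigh many more signals (merchant, location, time of day) and learn the thresholds from data rather than hard-coding them, but the pattern-versus-history comparison is the same in spirit.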
Now, it’s also important to consider that machine learning algorithms are only as good as the data they are trained on. If the data is biased or incomplete, the algorithm will learn to make biased or inaccurate predictions.
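That “biased data in, biased predictions out” point can be made concrete with a deliberately trivial model (predict the mean of whatever you trained on), again using invented numbers:

```python
# All observed values vs. an incomplete slice of them (invented data)
full_data = [30, 35, 40, 45, 90, 95, 100]
biased_slice = full_data[:4]      # a sample that missed the high values

def predict(train):
    """A trivially simple model: predict the mean of its training data."""
    return sum(train) / len(train)

print(round(predict(full_data), 1))     # trained on everything
print(round(predict(biased_slice), 1))  # trained on the biased subset
```

The two answers differ substantially, not because the algorithm changed, but because one version never saw part of the picture. The same failure mode applies, far less visibly, to complex models.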
Machine learning algorithms can also be complex and hard to interpret, making it difficult to understand why an algorithm makes certain predictions, or to identify potential errors.
But ultimately, by automating tasks, exploring data, building predictive models, and providing decision support, machine learning helps us extract Big Value from Big Data, and offers innovative and impactful opportunities to improve our lives.
Woooo hooooo! Take that, word problems!
ACP is proud to partner with many of the leading innovators in the AI space, including Microsoft, Google, Amazon, HPE, and Lenovo. If you’re interested in how AI and other innovative technologies can help your organization save time, improve efficiency, and provide better service to your customers, we’d love to help. Connect with us today.