
Automated Warehouse sibling site Design World typically focuses on motion systems across industries and shares insights from experts who are advancing technologies to support faster, smarter, more efficient, and more precise machinery.
During a recent webinar, its team discussed trends and technologies related to material handling systems, including robotics and conveyors in warehouses, production lines, packaging systems, and more. It also touched upon motion-adjacent technologies and systems that need to integrate seamlessly.
Design World recently spoke with Vikram Kolluru, a digital automation consultant at Belden Inc. He has extensive experience with automated warehouses, networking infrastructure, and material handling systems in different industries and at different stages of digital and automation maturity.
Technology advancements in material handling
What technology advancements are we seeing today regarding anomaly detection on conveyor lines, robotics, or other types of material handling systems?
Kolluru: That’s a very broad topic, so let me go back into the history a bit and then come back. I can give you an example of a very big cosmetics manufacturer that made lipstick, and everything was automated.
However, they had always positioned a human at the end of the line, right before the cap went on, to make sure that the lipstick tip met their standards. A person, a human, was always there, and at that time — I’m talking 15 years ago — they used to claim that no automation could replace human vision and detect defects like that.
But just look where we are today with all the automation systems that we have, with the vision systems, the machine learning, the deep learning. They are probably doing a much better job than a human would, and at much higher throughput, right?
So that’s the long journey we’ve come from, and that’s the kind of application we’ve been seeing in most of these automation projects — material handling, process, or anything else — where the convergence of IoT devices and the technologies that can process all this data in real time has been enormously helpful to everybody — production, distribution, operations — in every which way.
Tackling labor issues in warehousing
You’re saying even 15 years ago, people thought that technology couldn’t replace humans, but now we’re finding we have a lot fewer humans who will even be present, doing the work, so there’s no choice but to at least try out the automation.
Kolluru: Yes, that statement still rings in my mind because we integrated an entire line, and they still wanted this one person. The statement was, “You will never be able to replace this human who can detect that defect on the lipstick.” And I can assure you that right now, that person doesn’t exist on that line.
So, as far as anomaly detection is concerned, everybody has their own take, even with the new technologies. There are supervised and unsupervised approaches; some go as far as incorporating AI and machine learning, while others could be simple scanners and sensors that automatically check dimensions, weights, heights, or anything else.
We have installed systems that filled vials, where the quantity we had to dispense was measured in microliters. It is such a small quantity, and there is a minimum threshold. You cannot fill less than that, because the product was going to cancer patient research, so anything less would be of no use for what they were doing.
And when they used to make a batch of about 16,000 of these vial tubes, a good 5% would be waste, and they were very expensive. So the way we fixed that problem was to incorporate the right sensors in their vision systems — very highly accurate vision systems, which would make sure that the fill levels are within the threshold.
When we’re talking microliters, even a tolerance of 2 to 3% is an extremely tight window. These systems were capable of detecting fill levels to that standard and then making a yes-or-no decision in real time as the line kept moving.
Now, we had seen humans do that verification before we made the change, and the results of the validation after [the change] were night and day. Just think of a human comparing those levels on 16,000 of those tubes versus this vision system doing it automatically.
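To picture the decision logic Kolluru is describing, here is a minimal sketch of that kind of pass/fail fill check. The target volume, tolerance band, and readings are invented for illustration, not figures from the project above.

```python
# Hypothetical fill-level check: a vision system reports a measured volume
# in microliters, and the line accepts or rejects each vial in real time.

TARGET_UL = 250.0        # assumed target fill volume (microliters)
TOLERANCE = 0.03         # assumed 3% tolerance band around the target
MIN_UL = TARGET_UL * (1 - TOLERANCE)   # never ship a vial filled below this
MAX_UL = TARGET_UL * (1 + TOLERANCE)

def accept_vial(measured_ul: float) -> bool:
    """Return True if the measured fill volume is within the allowed band."""
    return MIN_UL <= measured_ul <= MAX_UL

# Example readings, as if streamed from the vision system
for reading in (249.1, 255.3, 241.0, 250.0):
    verdict = "PASS" if accept_vial(reading) else "REJECT"
    print(f"{reading:7.1f} uL -> {verdict}")
```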
That’s the kind of implementation that we would want to use when it comes to improving quality as well as throughput, which obviously serves not just the manufacturer, but also the customer.
How to create precise systems

What kind of equipment, hardware, and design considerations do engineers need to think about, regardless of their industry, to create a precise system like that?
Kolluru: There are different things that we can put together. You can have machine vision, you can have robots. You can have AI and predictive analytics. You could have IoT devices all connected to the same data broker, and then you could have an overlay application that is contextualizing all the data coming in in real time.
Also, there is hardware nowadays that will help you in processing all this data right at the source, rather than taking it to a central location. It also saves bandwidth and the time to process this, right? So the decision-making has become extremely fast compared to what it used to be in the past.
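As a rough illustration of that “process at the source” idea, here is a minimal sketch of edge-side filtering: every reading is evaluated locally, and only out-of-band events are forwarded upstream, which is where the bandwidth and latency savings come from. The publish() stub stands in for whatever broker client (MQTT, OPC UA, and so on) a real system would use, and the names and limits are hypothetical.

```python
import json
import random
import time

# Hypothetical alarm band for a sensor value; in a real deployment this
# would come from configuration or from a learned baseline.
LOW_LIMIT, HIGH_LIMIT = 20.0, 80.0

def publish(topic: str, payload: dict) -> None:
    """Stand-in for a broker publish call (e.g., MQTT or OPC UA)."""
    print(f"-> {topic}: {json.dumps(payload)}")

def read_sensor() -> float:
    """Stand-in for reading a real photo eye, scanner, or analog input."""
    return random.gauss(50.0, 20.0)

# Edge loop: evaluate every reading locally, forward only the exceptions.
for _ in range(20):
    value = read_sensor()
    if not (LOW_LIMIT <= value <= HIGH_LIMIT):
        publish("plant/line1/anomaly", {"value": round(value, 2), "ts": time.time()})
    time.sleep(0.05)
```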
You’ve got your photo eyes and barcode scanners and vision systems that are of very high precision, sending a lot of data. And at the same time, you’ve got computing systems that have become extremely fast at processing all this high-quality data coming in. All of that combined helps you make these decisions in real time.
But to get started, it is a two-pronged approach, I would say, because there is a set of responsibilities on the end user and also a certain level of responsibility on the system integrator to make this happen.
Let’s say somebody is working on a newer project. It’s very easy to fall for the shiny stuff. We go to all these big shows like, “Oh wow, look at those robots. Look at this, look at that. I need that. I want that.” It’s like a kid, right? We walk into it. I personally feel like I’m walking into Disneyland whenever I go to ProMat or MODEX or anything like that.
But guess what? It’s not straightforward. There is a little bit of homework that the end client has to do. They have to know that they need to have the right network behind it. They need to have the right switches, right Wi-Fi connections, right design in the warehouse or the production facility. So they have to provide all that to the system integrator to be able to accomplish what the end user is asking for. That’s the kind of homework we talk about.
At the same time, for the system integrator side of it, it is very important that they partner with the end client and guide them through why they’re doing what they’re doing and also explain, “These are all the things that we can bring in for you to be able to proactively detect that your process is going out of the band and you might want to proactively stop and look at the quality of what you’re doing.”
So it’s that kind of slow but collaborative approach that is needed. When it comes to the system integrators, they can focus on the emerging technologies, and the end user needs to focus on being open to doing this homework for the system integrator, for both these things to happen together.
The role of AI in warehousing
Speaking of shiny objects, how is artificial intelligence transforming anomaly detection?
Kolluru: Enormously. There is extra hype about what AI can do, which has probably been driven by what Hollywood has shown us. But at the same time, it is not just pure hype. There are substantive gains that we have made with AI, machine learning, and deep learning, especially when it comes to anomaly detection.
The unsupervised learning ability of these systems comes from just watching how the conveyor has been running for a while and then noticing, “Wait a minute, it’s not running the way it was running the last five months. Something is different.”
Or you’re looking at an HVAC system running for a while, and “wait a minute, this one is drawing more current than normal. Why is it doing that now?” Then digging into why that has happened, and identifying that there could be a problem to take care of, all by itself rather than waiting for a human to take action, is where the value is, because these systems are faster.
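A very simple version of that “drawing more current than normal” check can be written as a rolling baseline: learn the recent mean and spread of a signal, then flag readings that land far outside it. This is only a sketch with invented numbers; a real deployment would tune the window and threshold or use richer unsupervised models.

```python
from collections import deque
from statistics import mean, stdev

WINDOW = 200     # number of recent samples that define "normal"
SIGMAS = 3.0     # how far outside the baseline counts as an anomaly

history = deque(maxlen=WINDOW)

def check_current(amps: float) -> bool:
    """Return True if this current reading looks anomalous vs. recent history."""
    anomalous = False
    if len(history) >= 30:                    # wait for a minimal baseline
        mu, sigma = mean(history), stdev(history)
        anomalous = abs(amps - mu) > SIGMAS * max(sigma, 1e-6)
    history.append(amps)
    return anomalous

# Example: a motor that normally draws about 4.2 A suddenly pulls 6.5 A.
readings = [4.2 + 0.05 * (i % 5) for i in range(100)] + [6.5]
for amps in readings:
    if check_current(amps):
        print(f"Anomaly: {amps:.2f} A is outside the learned baseline")
```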
A human cannot be standing all the time right in front of the place where there could potentially be a problem. You have systems that are always in place, continuously monitoring, and they have so much data to analyze and predict these failures, which helps us take proactive action and fix the systems, rather than, “The entire conveyor is gone. Now your production has stopped.”
So it’s not just the cost of the conveyor section that you’re replacing. It is the downtime that you have to account for, the impact on the business that you have to account for. That is where anomaly detection and predictive analysis make a major impact.
Maintaining high-speed applications
We talk a lot about high-speed applications. How do anomaly detection technologies help keep up the pace?
Kolluru: I could take the example of a sortation system. Think of the long, super fast FedEx and UPS sorters in use today, where you have vision systems on top of the conveyors, and all your mail is dumped — literally dumped into the container, not placed one piece after another — and you see these systems sort it automatically, with the vision system detecting what the address is and where the destination is.
You would think, “That is a crumpled poly bag. How is it able to read it? How is it able to get that?” But the way the systems are now designed, it’s not just about what the camera can see; it’s also about what it can estimate based on its past learning.
So those technologies now determine what is good and what is bad at rates you can hardly imagine — think how fast FedEx and UPS sort their millions of packages every day. That is also something we can apply to anomaly detection.
I also worked on projects in the bottling industry. Even in those days, we used to have sensors that would check the fill levels of the cans, and even at that speed, they were really good at detecting and ejecting bottles on the fly. The fill level is a preset threshold: if a container meets the setting, it passes; if not, it gets ejected.
But with the AI and machine learning level of detection, you are bringing human interpretation into that. It’s not just one threshold that goes on and off. Sometimes you can also detect something that you cannot read.
During demonstrations, they often show you a crumpled bag of M&Ms in a black-and-white picture and ask, “Which flavor of M&Ms is it?” You cannot say, but these systems are trained well enough that, even with the bag crumpled, they know what it is. So if they see any of those minor infractions, and you’re able to detect them, that is enormous, because many times the human eye will miss them.
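To make the “not just one threshold” point concrete, here is a hedged sketch using scikit-learn’s IsolationForest, an off-the-shelf unsupervised detector: it learns what combinations of readings look normal and flags unusual combinations, rather than tripping on a single preset limit. The feature names and data below are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Hypothetical "normal" operation: each row is one conveyor cycle with
# [motor_current_A, vibration_mm_s, belt_speed_m_s].
normal = np.column_stack([
    rng.normal(4.2, 0.1, 500),
    rng.normal(1.0, 0.2, 500),
    rng.normal(1.5, 0.05, 500),
])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# Two new cycles: one typical, one with current and vibration creeping up
# together; each value alone might still pass a fixed alarm limit, but the
# combination is unusual relative to the learned baseline.
new_cycles = np.array([
    [4.25, 1.05, 1.50],
    [4.60, 1.80, 1.48],
])
print(model.predict(new_cycles))   # 1 = looks normal, -1 = flagged as anomalous
```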
These sensors nowadays, they’re all embedded. They are in line. They are not taking up any space. They are miniature. They use edge computing to process in real time.
The end goal is that the mean time between failures has to be very high. The OEE [overall equipment effectiveness] has to be high. The throughput has to meet and exceed expectations. And I think we are getting there.
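For readers who want a quick reference on those metrics, here is a small worked example with made-up shift numbers: OEE is conventionally the product of availability, performance, and quality, and MTBF is operating time divided by the number of failures.

```python
# Worked example with invented shift numbers, not data from any site above.

planned_time_min = 480        # one 8-hour shift
downtime_min = 45             # unplanned stops
ideal_cycle_s = 2.0           # ideal seconds per unit
units_produced = 11_000
good_units = 10_750
failures = 3

run_time_min = planned_time_min - downtime_min

availability = run_time_min / planned_time_min
performance = (units_produced * ideal_cycle_s / 60) / run_time_min
quality = good_units / units_produced

oee = availability * performance * quality
mtbf_hours = (run_time_min / 60) / failures

print(f"Availability {availability:.1%}, Performance {performance:.1%}, "
      f"Quality {quality:.1%} -> OEE {oee:.1%}")
print(f"MTBF roughly {mtbf_hours:.1f} h between failures this shift")
```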
Making anomaly detection more productive
Do you have any additional guidance for engineers or system integrators so they can build more productively and optimize anomaly detection in material handling systems?
Kolluru: One is to work with the end user to integrate these AI-driven tools for predictive maintenance or demand forecasting and this anomaly detection that we’re talking about. It has to be part of the solution. Whether the end customer buys into it or not, that’s a different thing.
I think it is imperative now that you set the customer up for success in the future, because if they don’t do it now, then five or 10 years from now they will already look like a 20- or 30-year-old setup, right? So that is very important for them to do.
When it comes to machine vision and robots, I would say, deploy vision-guided robots for automated picking, sorting, anything in the warehouses. We’re seeing a lot of humanoid robots taking a very active role in doing repetitive tasks that humans could do. These are probably holy grails, if you could call them that, because they combine vision systems, robotic arms, and AMRs [autonomous mobile robots] all in one. And that’s what we need.
And at the end of the day, they say they don’t even have to integrate with your ERP system. They work just like humans. Because humans don’t integrate with your ERP systems. Humans are humans, right? So they are designing these systems in such a way that they are a replacement for human actions. That’s another trend that we have seen, which is showing a lot of promise and that is taking off.
When it comes to IoT and connected devices, I would say, implement those sensor networks as much as possible, even retrofitting. There are many technologies out there that can be retrofitted for real-time monitoring. You need the right IT infrastructure, but it’s very light nowadays; you don’t need an enormous amount of hardware to make this happen. One or two very compact devices can do this for you. We have implemented some directly on machines to monitor their performance.
Of course, Wi-Fi and connectivity are the backbone for all of this. You can install any number of sensors, and any number of servers, analytics, and services. But if your network is lagging, if your network is dropping packets, then you’re failing yourself.
And, more importantly, make whatever we install modular, so that we can replace or scale up anything that needs it. That is something I would certainly push for.
But in every one of my speeches or talks that I do, I always end this with one statement: Don’t forget the worker, because you could have very advanced systems working at your site, but you still need somebody to troubleshoot them or be there with them. Does it mean upskilling of your worker? Yes, because that worker will be your Level 1 support.
Many of the OEMs or system integrators claim that, “We can fix the system on the phone. Just pick up the phone and call us and explain what the problem is.”
Guess what? Not everybody is even capable of explaining what the problem is with these modern systems. And so you need the right worker there.
In the past, we used to say, “I got 30 years of experience working on conveyors, and I can come into your job.” No, sorry, if you’ve been doing this for 30 years, then you may or may not be ready to work with AI.
And at the same time, it’s the other way around with, “I was born and brought up in AI.” No, sorry. So it’s a fine line you have to walk to find the right resource to work with all these technologies.
There’s a lot that everybody overlooks. The end customer might expect that this is going to come in and do magic for them. And, no, it’s not exactly magic. It’s something you have to work at.
When it comes to the OEMs and system integrators, you did the project, you installed everything, but who’s going to maintain it? And they always talk about training, but there are two different types of training.
One is operational training; the other is maintenance training. Operational training is probably one of the easiest things nowadays. With so much advancement in the machinery, all you do is push a button. Or maybe you don’t even do that; you just walk in, and it detects that you are there and starts running. It is that good.
The problem is, when you push the button and something doesn’t happen, then you need to know, “What do I do?” Did you train the person to recover from there? That is a very critical thing that many overlook, and it’s very important that you look at this holistically, not just at the technology and installation or the latest trend.
Editor’s note: This article was syndicated from Automated Warehouse sibling site Design World.
