The network edge is the focus of great swathes of the industry today, and the more we look at it the wider the variety of approaches we see. Different applications require different levels of proximity to the users, and the technology to make them happen isn’t always a miniaturization of the hyperscale world. One company taking on the edge as far out as the equipment closet or the remote worker office is Veea. With us today to talk about Veea’s approach to the ‘Device Edge’ is Kurt Michel, SVP of Marketing at Veea.
TR: What are the origins of Veea? What problem were you looking to solve?
KM: Veea was started in 2014 by our CEO Allen Salmasi. Allen was the president at Qualcomm during the 90s when CDMA was being developed. He started Veea based on the idea that the edge of the network was going to require something different from core data centers. He thought there was a better way to combine connectivity and computation that would fit into that market. From that we have evolved something called the smart edge node, which combines edge servers, connectivity, IoT gateways, WiFi, LoRaWAN, and a variety of different wireless protocols. They can converge a lot of different equipment and can be deployed in the same places where WiFi access points and routers are deployed today – locations that people are already familiar with. They don’t require racks or cooling or special power and equipment closets. What they do that a micro data center might not is use a wireless or wired mesh protocol to connect the devices.
TR: What does that meshing empower?
KM: The computational resource that’s available in these smart edge nodes has access throughout the mesh via a virtual bus capability. So an IoT device that is connected to any smart edge node on the mesh has the ability to communicate with any IoT device anywhere else on the mesh. Then all of that can be backhauled to a central edge network management facility. So you can build, deploy, and expand edge networks simply and wirelessly.
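To picture the virtual-bus idea, here is a minimal sketch of a mesh-wide publish/subscribe bus where a device attached to one node can reach a device attached to another. This is my own illustration of the concept, not Veea's actual software; all class and topic names are hypothetical.

```python
# Hypothetical sketch of a mesh-wide "virtual bus": every node shares one
# logical message bus, so an IoT device attached to any node can reach a
# device attached to any other node. Names are illustrative only.
from collections import defaultdict

class VirtualBus:
    """One logical bus spanning all smart edge nodes in the mesh."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

class EdgeNode:
    """A smart edge node; devices attach locally but talk over the shared bus."""
    def __init__(self, name, bus):
        self.name, self.bus = name, bus

    def attach_device(self, topic, handler):
        self.bus.subscribe(topic, handler)

    def send(self, topic, payload):
        self.bus.publish(topic, payload)

# Two devices attached to different nodes, one shared bus:
bus = VirtualBus()
node_a = EdgeNode("closet-1", bus)
node_b = EdgeNode("closet-2", bus)
received = []
node_b.attach_device("hvac/fan", received.append)
node_a.send("hvac/fan", {"rpm": 1200})
print(received)  # [{'rpm': 1200}]
```

In a real mesh the bus would of course span physical links rather than one process, but the programming model is the same: devices address topics, not nodes.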
TR: Where are these smart edge devices intended to be deployed?
KM: They can be at the customer premises. We have solutions that work in outdoor smart city environments. And we’re seeing deployments for smart building and industrial IoT applications. We recently did a case study on managing ventilation systems in underground tunnels, such as train tunnels, in which the existing wired infrastructure was replaced with something that was far more easily deployed, managed, and controlled, and which removed a lot of the issues that come from wired deployments. You could also deploy these for smart agriculture, using LoRaWAN to pick up all of your moisture sensors in the fields or using WiFi to collect video feeds from your livestock and do some level of video processing on that to capture what you think is worthy of sending back up to the cloud for deeper analysis.
We work with other cloud-based IoT platforms and we are certified for Azure IoT, with integration in our software that makes it easy for you to communicate back to the Microsoft Azure platform. We are working to accelerate the ability for third party software providers to build on our platform. We use a Docker container architecture so that multiple containerized applications can run on the platform simultaneously, and we can centrally deploy those applications from the cloud.
TR: How do you view the edge in today’s hyperscale-focused infrastructure? Where in the ecosystem does Veea’s technology fit?
KM: The idea of the hyperscale data center is fantastic. It was built around the idea of hyperscale and makes sense for aggregating all the traffic and doing deep inspections. But there are a variety of emerging tasks that require low latency: augmented reality, virtual reality, and anything that requires some kind of human interaction as close as possible to the person. The best way to do that may not always be to just take the hyperscale data center idea and compress it down into a box to put at the base of a cell tower. There are a variety of places in retail environments, in smart buildings, in smart streets, where you can deploy solutions that can do the processing, curation, and filtering that you need. The edge does not always need data centers. There is a hierarchical approach that can be taken, and placing compute resources as close as possible to the endpoints is best if you want to reduce bandwidth on the network and if you want to be able to respond as quickly as possible to the user.
This model doesn’t change the need for micro data centers at the edge; rather, I would say it’s an adjunct to them. I call it the device edge, whereas you might think of the micro data center as the access edge or the aggregation edge. I formerly worked at Akamai, which was all about the edge, but at the ISP edge, which is not as close as the device edge. If you’re looking for low latency response, if you’re looking to minimize the amount of bandwidth needed for backhaul by processing data right there, that’s almost the ultimate way to handle it.
TR: What does your ecosystem look like? How do you reach your customers and how does your gear get deployed?
KM: We work with integrators. In the case of the ventilation system that I talked about, we worked with Trionix. We have an agreement with SK Engineering and Construction, the large Korean conglomerate. They are using our platform and our tools as part of an overall safety management system because of its deployment flexibility and ease of scale. They are setting up cameras, detectors and various warning systems, and aggregating them all into the edge platform that they’re deploying for use on their construction sites. And even further, they’re thinking about productizing that and making it available beyond just SK.
TR: How has COVID and the rise of the remote workforce affected Veea this year? Has it driven new demand?
KM: There is certainly a lot more interest in remote capabilities. We make a variety of different devices, and one of them is VeeaHub, which is actually quite good at simple deployments like a home office environment. You can just plug it in and create a secure connection into the enterprise that is actually manageable from the enterprise. IT departments have really struggled as remote work has grown due to bottlenecks from VPN capabilities and gateways. A solution that’s more peer-to-peer based but centrally managed is of great interest. So we have gotten a great deal of interest in that solution.
TR: Do you think that extra demand will remain as the pandemic itself wanes?
KM: I do. I think there’s a significant shift in the number of people who are going to be part of the remote workforce. In my opinion, what the pandemic has done is force people to embrace remote work and to trust people to do remote work. It has forced us all five years ahead of where we would have been. Anecdotally, I know of a variety of companies that are planning to reduce their office space and allow employees to work remotely for the long term while saving a lot of money. In other places, I’m hearing about improvement in the efficiency and productivity of workers when they don’t have to commute. I think that IT departments are recognizing that remote work is here to stay and are looking at the infrastructure that they need to manage it. Once they deploy an edge node in these locations, they also have the opportunity to deploy enterprise applications the same way they would have in the building, in a distributed way across these edge nodes.
TR: What challenges remain to be solved to make this happen?
KM: Cyber attacks have been very specifically targeted at remote workers and the vulnerabilities that are there. Enterprises have had to accept the fact that there’s a new level of risk as you distribute your workforce out. But I think they’re now looking at how to patch the holes, how to do all this right. How do we design an infrastructure that is truly distributed yet secure?
TR: Where does Veea have a presence, and what global markets do you focus the most on today?
KM: We are a New York City company, and we have development locations in New Jersey, in Iceland, and in Bath, UK. We are not geographically focused, and we have sales offices around the world. We are doing quite a bit right now in South Korea, and from there we are branching out into the wider Asia-Pacific opportunity. We are doing things with the Cote d’Azur University in France with the IMREDD consortium on smart city applications in that region. We are seeing demand in property management and advertising in North America. And the tunnel application that I described is in Latin America.
TR: What is ahead of Veea on the technology roadmap?
KM: First, we’re looking at moving towards a 5G device, which will be a very small, elegant, integrated package that will work well in a variety of remote worker and small office environments. It will have a modular approach that allows the simple addition of different features and capabilities in a highly scalable way. It will change the way we scale, from simply adding nodes horizontally across the mesh to scaling performance and capabilities vertically within a specific node as well.
Second, we will be focusing more on the orchestration of the applications within the mesh. Right now, as you deploy your applications you can move them amongst the nodes, add more performance, and distribute them across the mesh. What we’d like to see is to have that more easily automated and elastic based on demand. And we’re continuously evolving our mesh technology to make it completely resilient, perfectly elastic and more extensible.
Then we have something called the IoT toolkit, which integrates software drivers and other things, and which we continuously expand and simplify. And we are going to add additional cloud platforms and IoT platforms beyond what we have today. That will evolve organically as the applications come in.
And another area that we’re really focused on is computer vision and being able to assist computer vision at the edge, being able to parse out the portions of the computer vision hierarchy. Let’s say you’re trying to figure out if somebody is wearing a mask. The first level is whether there is somebody in the image. You don’t want to take all that video and backhaul it to the cloud when there’s nothing going on. You need some level of detection as close as possible to the edge that says, “There’s a person here.” Another level of detection might ask, “Is this person wearing a mask?” A third level of detection might say, “If the person is not wearing a mask, who is that person?”, which is the kind of deep facial recognition analysis that should be moved toward the cloud. You might call it a smart filter, or a curation capability.
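The tiered filtering described above can be sketched as a cascade in which each stage runs only if the previous one fires, and only the rare final case is backhauled. This is purely an illustration of the pattern; the detector functions are hypothetical stand-ins, not Veea's implementation.

```python
# Hypothetical tiered "smart filter" for the mask example: cheap checks run
# at the device edge, and only frames needing deep analysis are backhauled.
def person_detected(frame):
    # Stage 1: lightweight presence detection at the device edge.
    return frame.get("person", False)

def wearing_mask(frame):
    # Stage 2: a slightly heavier classifier, still running locally.
    return frame.get("mask", False)

def backhaul_for_identification(frame):
    # Stage 3: deep facial recognition belongs in the cloud; here we just
    # mark the frame for backhaul rather than actually sending it.
    return {"action": "send_to_cloud", "frame_id": frame["id"]}

def smart_filter(frame):
    """Return None (drop locally) or a backhaul request for this frame."""
    if not person_detected(frame):
        return None          # nothing going on: don't backhaul the video
    if wearing_mask(frame):
        return None          # compliant: no further analysis needed
    return backhaul_for_identification(frame)

frames = [
    {"id": 1},                                # empty scene
    {"id": 2, "person": True, "mask": True},  # masked person
    {"id": 3, "person": True, "mask": False}, # unmasked person
]
to_cloud = [r for r in map(smart_filter, frames) if r]
print(to_cloud)  # [{'action': 'send_to_cloud', 'frame_id': 3}]
```

Of the three frames, only the unmasked-person frame survives the cascade, which is the bandwidth-reduction point being made: most video never leaves the edge.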
TR: Thank you for talking with Telecom Ramblings!