Seoul Robotics makes robots. From the name that’s obvious, but what the company actually does is far more intriguing – it makes robots more intelligent. In fact, that is one of the company’s main areas of focus. With its 3D computer vision software platform that processes lidar sensor data, Seoul Robotics’ technology helps robots see and interpret the world around them better, making it invaluable to the autonomous mobility space.
Recently, the company has had several major developments. So we caught up with CEO and Co-Founder, HanBin Lee, to discover more.
Lee met his fellow Seoul Robotics co-founders – Oran Kwon, Jaeil Park, and Hong Truong – in 2017 while taking online machine learning classes through the U.S. educational organization Udacity.
“We teamed up for a competition to develop software for self-driving cars. The competition challenged teams to leverage sensor data to identify objects and keep cars on the road. People were using cameras and radar, but nobody was using just lidar,” he explains.
“We saw a vacuum in the lidar space – at the time there were only a couple of research papers out there on applying deep learning to lidar – so we decided to rely on lidar to see how far we could go. We ended up coming in tenth place out of over 2,000 teams – it was clear we were onto something.”
“We founded Seoul Robotics in 2017 to create a sensor-agnostic 3D perception software for the masses, to make robots more intelligent and disrupt the future of mobility,” says Lee, taking us back to the time he started the company.
“We see our product as the bridge between hardware companies producing sensors and the industries looking to integrate 3D data insights into their operations. While 3D sensing is most often associated with self-driving cars, with the cost of 3D sensors coming down, the technology is now seeing wider adoption (see: lidar in the latest iPhone).”
“Data from 3D sensors can deliver another dimension of insight for numerous industries, such as retail, manufacturing, and road safety. Before our software came to market, there was no easy way for these companies to capture data insights from sensors. Now companies in numerous industries can pair 3D sensors with our software for a variety of functions, from automated logistics to understanding in-store traffic patterns,” he adds.
Giving Robots a Helping Hand
Over the past three years, Seoul Robotics has gained prominence in the Asian and European markets, developing partnerships with a range of companies and organizations, including BMW, Mercedes-Benz, the Korean government, and U.S. Departments of Transportation.
Coming back to the part about making robots intelligent, I was curious to know more about how Seoul Robotics aims to achieve this, and how its lidar technology solutions are going to fuel driverless cars and smart cities.
Lee explains: “Seoul Robotics makes robots intelligent through our industry-leading perception software platform, SENSR. Our patented 3D computer vision system is powered by machine learning that allows for live detection, tracking, and classification of objects. SENSR’s architecture is open and extremely accurate, making it a viable solution for numerous lidar applications, like V2X (vehicle-to-everything) communication, traffic safety, and retail analytics.
“The platform is sensor agnostic, compatible with over 70 different types and brands of 3D sensors, including lidar sensors, 3D cameras, and imaging radar. Traditionally, 3D perception software is developed in-house – by companies with a lot of capital – to be used with a specific sensor, making it challenging and cost-prohibitive outside of select applications. With SENSR, we are democratizing lidar and making it easier for businesses and organizations to glean insights, opening up a traditionally siloed industry to new applications.
“Sebastian Thrun, Udacity co-founder and previous winner of the Defense Advanced Research Projects Agency (DARPA) self-driving car challenge, said that autonomy is 90 percent a perception problem. Robots are not yet able to understand the world as we do. We are solving that challenge for 3D sensors, similar to how companies like Tesla are addressing that problem for 2D computer vision.”
Seoul Robotics’ AI-powered, smart 3D perception engine has been integrated into the company’s other product offerings, including its newly launched Lidar Processing Unit (LPU), Discovery.
Lee helps me understand this better: “Our core technology is understanding 3D data and patterns with machine learning. Through processing volume and shape information, we are helping robots understand an object’s location, size, speed, class, and other useful information with extreme accuracy, creating a safer and more reliable autonomous system – something that is limited with 2D-based perception. Our 3D perception engine is the core and base of all products.
“The engine is a cumulative technology that we have trained and crafted over the last three years utilizing the latest breakthrough algorithms in AI and innovations in the 3D sensor industry.
“Discovery is a first-of-its-kind lidar processing unit (LPU), a computer or chip dedicated to processing lidar with AI software. At a basic level, Discovery turns 3D sensors into IoT devices. More specifically, Discovery is an all-in-one, plug-and-play solution that pairs sensors with an LPU powered by our 3D perception engine software, SENSR.”
“Thanks to its industry-leading combination of hardware and software, users can implement 3D data applications right out of the box with the most precise and reliable 3D perception software available today. Because it can be customized, the platform has a number of different uses, from wrong-way detection on off-ramps to automating assembly processes,” adds Lee.
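The pipeline Lee describes – grouping raw 3D points into discrete objects and estimating each object’s location and size – can be sketched in miniature. The following toy example is not Seoul Robotics’ actual algorithm, and is far simpler than production perception code: it clusters a handful of 3D points by distance and derives an axis-aligned bounding box for each cluster.

```python
import math
from collections import deque

def cluster_points(points, eps=1.0):
    """Group 3D points into clusters: points within `eps` metres of each
    other belong to the same object. A simplistic stand-in for the
    clustering stage of a lidar perception pipeline."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = deque([seed]), [seed]
        while queue:
            i = queue.popleft()
            # Pull every still-unvisited point within eps of point i
            near = [j for j in unvisited
                    if math.dist(points[i], points[j]) <= eps]
            for j in near:
                unvisited.discard(j)
                queue.append(j)
                cluster.append(j)
        clusters.append([points[i] for i in cluster])
    return clusters

def bounding_box(cluster):
    """Axis-aligned bounding box: centre (location) and extent (size)."""
    mins = [min(p[k] for p in cluster) for k in range(3)]
    maxs = [max(p[k] for p in cluster) for k in range(3)]
    centre = tuple((mn + mx) / 2 for mn, mx in zip(mins, maxs))
    size = tuple(mx - mn for mn, mx in zip(mins, maxs))
    return centre, size

# Toy "frame": two well-separated groups of points
frame = [(0.0, 0.0, 0.0), (0.5, 0.2, 0.1), (0.3, 0.4, 0.0),
         (10.0, 5.0, 0.0), (10.4, 5.3, 0.2)]
for cluster in cluster_points(frame, eps=1.5):
    centre, size = bounding_box(cluster)
    print(len(cluster), "points, centre", centre, "size", size)
```

A real system would run this per frame at sensor rate, associate clusters across frames to estimate speed, and feed shape features to a classifier – the steps that SENSR replaces with learned models.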
Smarter, Safer City Solutions
Seoul Robotics is currently partnering with the likes of Mercedes-Benz, BMW, Qualcomm, and Israeli simulation company Cognata. According to Lee, the experience of collaborating with these industry giants has been a fulfilling one. “We are grateful to our partners and for the ability to collaborate with leading visionaries in mobility and smart city innovations around the globe.”
“We founded Seoul Robotics to enable 3D data insights for broader uses, and partners are bringing that vision to life by implementing SENSR and Discovery across a range of applications, from Mercedes-Benz tracking shopper traffic in showrooms to Qualcomm improving public safety with new smart city solutions.”
Lee says the partnership with Cognata will specifically help to create more robust and reliable lidar perception solutions in the AV and smart city space by simulating use cases that cannot be tested in the real world.
We see the future of mobility as autonomy through infrastructure.
While on the subject of partnerships and collaborations, Seoul Robotics recently announced a partnership with the Center for Urban Informatics and Progress (CUIP) and the Chattanooga Department of Transportation (CDOT).
When asked about the role the company plays in this collaboration, Lee said: “We are excited about our current deployment of SENSR to improve pedestrian safety in Chattanooga. In collaboration with Ouster, we are working to use our 3D perception software with lidar to help CUIP and CDOT understand pedestrian patterns in high-traffic areas and how to mitigate risk factors.”
“Equipped with SENSR, these smart sensors are installed in highly trafficked pedestrian areas and work to create a highly accurate and detailed understanding of how vehicles and pedestrians interact.”
In 2019, Seoul Robotics raised $5 million for expansion into North America with the goal of fostering new partnerships in the mobility space. Since then, it has grown its U.S.-based partnerships to include multiple state Departments of Transportation and educational organizations, including the University of Michigan’s Mcity, and seems excited for the opportunity to continue fostering new, impactful partnerships.
This funding also helped enable the company to continue updating its software offerings, giving it the potential to learn and process more information and further expand its applications.
Expanding on the company’s future plans, as well as his outlook on autonomous mobility and smart cities, Lee says: “While lidar is best known thanks to autonomous vehicles, we see the future of mobility as autonomy through infrastructure: the creation of connected ecosystems rather than sensors on vehicles themselves to develop and deploy mobility solutions.”
“Sensors can be placed around vehicles – on stoplights, warehouses, and highway overhangs – and then work in tandem with Seoul Robotics’ software and current 4G/5G connected-machine technology through partners like Qualcomm. This drastically reduces the number of sensors necessary for autonomous mobility, lowering the price for companies (and eventually, people) to access autonomous solutions.”
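To make the infrastructure-side idea concrete, here is a hypothetical sketch of the kind of object-list message a roadside perception node might publish over a 4G/5G link. Every field name below is invented for illustration – this is not an actual SENSR or V2X schema.

```python
import json

# Hypothetical detection produced by a roadside perception node.
# Field names are invented for illustration, not a real message format.
detection = {
    "sensor_id": "overpass-07",
    "timestamp_ms": 1700000000000,
    "objects": [
        {"class": "pedestrian", "position_m": [12.4, -3.1, 0.0],
         "velocity_mps": [1.2, 0.0, 0.0]},
        {"class": "car", "position_m": [40.0, 1.5, 0.0],
         "velocity_mps": [-8.5, 0.0, 0.0]},
    ],
}

message = json.dumps(detection)   # what travels over the network link
decoded = json.loads(message)     # what a vehicle or city system parses
print(len(decoded["objects"]), "objects from", decoded["sensor_id"])
```

Broadcasting compact object lists rather than raw point clouds is part of what makes such a connected ecosystem practical: the perception happens once at the roadside, and every nearby vehicle receives only the distilled result.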
Lee sees collaborations – such as its work with the city of Chattanooga – as the future of smart cities.
“These projects are valuable for demonstrating to other cities how insights from 3D data can fuel a more efficient and safer society. Building autonomy through infrastructure isn’t just an idea or something coming in the future; we’re making it a reality today,” concludes Lee.