To realize a symbiotic society between humans and robots

ROS System Integrator

ROS development

ROS development achievements

We introduce demonstration experiments and the technical elements currently under development using our proprietary ROS-based autonomous mobile robots.

Next-Generation Outdoor Vehicle
"Albion"

We are currently developing a next-generation outdoor mobile robot.

RoboCup @Home

We are developing a next-generation mobile manipulator with a reduced number of parts by optimizing component design and utilizing bent steel plates. It is expected to be completed around September 2025.

RoboCup @Home

RoboCup @Home requires a wide range of functionalities, including not only basic capabilities for autonomous mobile robots, such as map creation and navigation, but also the ability to recognize people and objects, engage in human interaction, and perform trajectory planning for robotic arms. Additionally, system integration is essential to combine these functions and execute various tasks.
In 2022, we mounted a robot arm on the "Cuboid" platform, and in 2024, we participated in the competition with the "SOAR" model.

In the "Stickler for the Rules" task, we provide ChatGPT with an image of the person being observed and a list of rules they are expected to follow, allowing it to determine in a single query which rules are being violated.
Example rules:

  • Whether they have removed their shoes

  • Whether they are holding a drink

  • Whether they have discarded anything on the floor
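A minimal sketch of this one-shot rule check, assuming an OpenAI-style chat payload with an attached image (the rule strings, prompt wording, and helper names are illustrative, not the actual prompt used on SOAR):

```python
import json

# Hypothetical rule list, mirroring the example rules above
RULES = [
    "shoes must be removed",
    "no drinks may be carried",
    "nothing may be left on the floor",
]

def build_rule_check_messages(image_b64, rules):
    """Build one chat request that asks a vision model to check all rules at once."""
    rule_text = "\n".join(f"{i + 1}. {r}" for i, r in enumerate(rules))
    prompt = (
        "Look at the person in the image and check the following rules:\n"
        f"{rule_text}\n"
        'Reply with JSON only: {"violations": [<rule numbers>]}'
    )
    return [{
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }]

def parse_violations(reply_text, rules):
    """Map the model's JSON reply back to human-readable rule strings."""
    numbers = json.loads(reply_text)["violations"]
    return [rules[n - 1] for n in numbers]

# Example: the model reports that rules 2 and 3 are being violated.
reply = '{"violations": [2, 3]}'
print(parse_violations(reply, RULES))
```

Checking all rules in one request keeps latency low compared with querying the model once per rule.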

By generating an action sequence with ChatGPT from the list of actions SOAR can perform, we enable the robot to interpret and execute non-standard commands given through human interaction, understanding the intent behind even unconventional instructions.

Command given: Get the milk from the coffee table
Action plan generated by ChatGPT:

  1. Move to the coffee table

  2. Detect the milk

  3. Grasp the milk

  4. Move to the location where the command was given
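Before executing a generated sequence like the one above, each step can be validated against the robot's known primitives. A sketch with illustrative action names (not SOAR's actual API):

```python
# Hypothetical subset of the robot's action primitives
AVAILABLE_ACTIONS = {"move_to", "detect_object", "grasp_object", "hand_over"}

def validate_plan(plan, available=AVAILABLE_ACTIONS):
    """Check that every step in an LLM-generated plan uses a known action.

    Returns (ok, unknown_actions) so the caller can re-prompt the model
    instead of executing an invalid plan.
    """
    unknown = [step["action"] for step in plan if step["action"] not in available]
    return (len(unknown) == 0, unknown)

# Plan corresponding to "Get the milk from the coffee table"
plan = [
    {"action": "move_to", "target": "coffee table"},
    {"action": "detect_object", "target": "milk"},
    {"action": "grasp_object", "target": "milk"},
    {"action": "move_to", "target": "operator"},
]
ok, unknown = validate_plan(plan)
print(ok)  # True
```

Rejecting plans that contain unknown actions is a cheap guard against the model inventing capabilities the robot does not have.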

Mobile Manipulator Development (SOAR)

 

We are developing mobile manipulators (autonomous mobile robots with arms) based on our experience and achievements in RoboCup 2022 and WRS. After 10 years of hard work, we are finally catching up with Fetch.
The intended task is shopping at convenience stores.

This is a video of the robot picking and placing cup noodles using image recognition.

Development Scene Video

 

For the robot to make purchases at a convenience store, it must perform the following flow:

  1. User: Specifies the order via speech (voice recognition)

  2. Robot: Moves to the convenience store, opens the door, takes the elevator, and arrives at the convenience store

  3. Robot: Picks up the item (item recognition and pick up)

  4. Robot: Shows the item to the clerk

  5. Clerk: Proceeds with the checkout (inputs into POS, scans the PayPay QR code attached to the robot)

  6. Robot: Receives the item

  7. Robot: Moves to the user, and hands over the item upon arrival
     
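The seven-step flow above can be sketched as a linear state machine; in practice each step would wrap navigation, perception, and interaction actions, with recovery behaviors on failure:

```python
from enum import Enum, auto

class Step(Enum):
    TAKE_ORDER = auto()      # 1. voice recognition
    GO_TO_STORE = auto()     # 2. doors, elevator, navigation
    PICK_ITEM = auto()       # 3. item recognition and pick-up
    SHOW_TO_CLERK = auto()   # 4. present item
    WAIT_CHECKOUT = auto()   # 5. POS / QR payment by clerk
    RECEIVE_ITEM = auto()    # 6. take the bagged item
    RETURN_TO_USER = auto()  # 7. hand over on arrival
    DONE = auto()

# Happy-path sequence matching steps 1-7 above
SEQUENCE = list(Step)

def next_step(current):
    """Advance to the next step; stay at DONE once finished."""
    i = SEQUENCE.index(current)
    return SEQUENCE[min(i + 1, len(SEQUENCE) - 1)]

step = Step.TAKE_ORDER
visited = [step]
while step is not Step.DONE:
    step = next_step(step)
    visited.append(step)
print([s.name for s in visited])
```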

Tsukuba Challenge

Every year, we participate in the "Tsukuba Challenge" held in Tsukuba City, Ibaraki Prefecture, Japan, with an autonomous outdoor mobile robot developed by the Chief Scientist Office.

The video was recorded at 4x speed, showing the run from start to finish on the map.
The map was created by projecting the color information from a fisheye camera onto a 3D Lidar (Velodyne) point cloud.
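Colorizing a Lidar point cloud this way amounts to projecting each 3D point into the fisheye image and sampling the pixel underneath. A simplified sketch using the equidistant fisheye model (calibration values are placeholders; a real pipeline also needs the camera-Lidar extrinsic transform):

```python
import math

def project_equidistant(point, fx, cx, cy):
    """Project a 3D point (camera frame, z forward) into a fisheye image
    using the equidistant model: image radius r = f * theta."""
    x, y, z = point
    theta = math.atan2(math.hypot(x, y), z)  # angle from the optical axis
    phi = math.atan2(y, x)                   # azimuth in the image plane
    r = fx * theta
    return (cx + r * math.cos(phi), cy + r * math.sin(phi))

def colorize(points, image, fx, cx, cy):
    """Attach the pixel color under each projected Lidar point.

    `image` is a row-major grid of (r, g, b) tuples; points behind the
    camera or outside the image are skipped.
    """
    h, w = len(image), len(image[0])
    colored = []
    for p in points:
        u, v = project_equidistant(p, fx, cx, cy)
        ui, vi = int(round(u)), int(round(v))
        if 0 <= ui < w and 0 <= vi < h and p[2] > 0:
            colored.append((p, image[vi][ui]))
    return colored
```

A point on the optical axis, for example, lands exactly on the principal point (cx, cy), which makes the projection easy to sanity-check.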

This video is a close-up shot.

Elevator patrol by "Fetch"

 

"Fetch," an autonomous robot with an arm from Fetch Robotics, operates elevator buttons and patrols a building. It also performs human detection using YOLO, a deep-learning-based image recognition algorithm.
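Downstream of YOLO, the patrol logic only needs confident person detections. A sketch of that filtering step, assuming a simple (class, confidence, bbox) detection format:

```python
def filter_people(detections, conf_threshold=0.5):
    """Keep only confident 'person' detections from a YOLO-style output.

    Each detection is (class_name, confidence, (x, y, w, h)); the format
    here is illustrative, not a specific YOLO library's API.
    """
    return [d for d in detections
            if d[0] == "person" and d[1] >= conf_threshold]

detections = [
    ("person", 0.91, (120, 40, 60, 180)),
    ("chair", 0.80, (300, 200, 80, 90)),   # wrong class, dropped
    ("person", 0.32, (10, 5, 40, 120)),    # below threshold, dropped
]
print(filter_people(detections))
```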

3D SLAM using R3LIVE

We installed a global-shutter camera and a Livox Horizon Lidar on an electric wheelchair and created a 3D map from the acquired data using R3LIVE.

Emotional expression based on "rostopic" content

 

The robot expresses emotions based on the content of ROS topics (rostopic).
Smooth facial animation is achieved using SVG and web animation libraries.
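The mapping from topic content to a displayed emotion can be a simple rule table. A hypothetical sketch (the actual topics, states, and emotion names used on the robot may differ):

```python
def choose_emotion(battery_pct, nav_status):
    """Pick a facial expression from robot state published on ROS topics.

    battery_pct would come from e.g. a battery-state topic and nav_status
    from the navigation stack; both names here are illustrative.
    """
    if battery_pct < 20:
        return "tired"         # low battery overrides everything
    if nav_status == "blocked":
        return "troubled"      # path blocked by an obstacle
    if nav_status == "goal_reached":
        return "happy"
    return "neutral"

print(choose_emotion(15, "moving"))        # tired
print(choose_emotion(80, "goal_reached"))  # happy
```

The chosen emotion string would then drive the SVG/web-animation face in the browser front end.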

Development of cliff detection function using 2D Lidar

 

This is a demonstration video of cliff detection near the stairs using the ROS-equipped robot "Cuboid."

The floor condition 1 m ahead is measured using a 2D Lidar mounted on the robot's head facing 45° downward; a cliff is registered as an obstacle on the local costmap so the robot avoids it.
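The geometry behind this: with the Lidar tilted 45° downward at height h, a flat floor returns a range of about h/sin(45°) ≈ 1.41·h at a point h meters ahead, so beams reporting a much longer range are looking past a drop. A sketch of the detection step, assuming a 1 m mounting height (the actual mounting height and margin on Cuboid may differ):

```python
import math

SENSOR_HEIGHT = 1.0          # m, Lidar at head height (assumed)
TILT = math.radians(45)      # facing 45 degrees downward
EXPECTED_RANGE = SENSOR_HEIGHT / math.sin(TILT)  # ~1.41 m to a flat floor
MARGIN = 0.3                 # m, tolerance before flagging a cliff

def find_cliffs(ranges):
    """Return beam indices whose measured range is much longer than the
    flat-floor expectation, i.e. the floor has dropped away (a cliff)."""
    return [i for i, r in enumerate(ranges)
            if r > EXPECTED_RANGE + MARGIN]

scan = [1.41, 1.40, 2.80, 2.95, 1.42]  # two beams see past a stair edge
print(find_cliffs(scan))               # [2, 3]
```

The flagged beams would then be inserted into the local costmap as obstacles so the navigation stack plans around the edge.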

Data sharing between delivery robots and AR navigation using spatial ID
(Digital Agency demonstration)

 

As part of the "Survey and Research on Digital Twin Construction" commissioned by the Digital Agency, a demonstration experiment on data sharing between delivery robots and AR navigation using spatial ID was conducted in February 2023 at Tokyo Port City Takeshiba in Minato-ku, Tokyo.

Delivering luggage in cooperation with elevators
(NEDO demonstration)

 

As a project implementer for the New Energy and Industrial Technology Development Organization (NEDO), a national research and development agency, we conducted a demonstration experiment of indoor delivery using an elevator-linkage system with an autonomous robot.

Remote monitoring Web system using ROS-equipped robot

 

Video streaming and rostopic transmission/reception are performed over WebRTC, enabling remote monitoring and simple teleoperation of the robot.
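One way to carry rostopic traffic over a WebRTC data channel is to wrap each message in a small JSON envelope. A sketch (field names are illustrative, not the system's actual wire format):

```python
import json

def wrap_topic_message(topic, msg_type, payload):
    """Serialize a ROS topic message into a JSON envelope suitable for
    sending over a WebRTC data channel."""
    return json.dumps({"op": "publish", "topic": topic,
                       "type": msg_type, "msg": payload})

def unwrap_topic_message(data):
    """Recover the topic name and message payload on the receiving side."""
    envelope = json.loads(data)
    return envelope["topic"], envelope["msg"]

wire = wrap_topic_message("/cmd_vel", "geometry_msgs/Twist",
                          {"linear": {"x": 0.2}, "angular": {"z": 0.0}})
topic, msg = unwrap_topic_message(wire)
print(topic, msg["linear"]["x"])
```

On the browser side the same envelope can be parsed with `JSON.parse`, which keeps the remote-monitoring UI decoupled from ROS message binaries.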

World Robot Summit 2020
Future Convenience Store Challenge

 

In 2020, we participated in the World Robot Challenge (WRC), an event sponsored by the Ministry of Economy, Trade and Industry (METI) and the New Energy and Industrial Technology Development Organization (NEDO) as part of the World Robot Summit (WRS). Our team achieved third place in the "Future Convenience Store Challenge," which focused on the coexistence of humans and robots in a convenience store environment through tasks such as stocking and disposing of products.

Utilizing Cuboid with Signage Module at STATION Ai

 

In October 2024, the autonomous mobile robot "Cuboid" was introduced to the open innovation hub "STATION Ai," which opened in Tsurumai, Nagoya. This robot moves between the member zone and the general zone, allowing all visitors to STATION Ai to interact with it.

Cube petit

 

"Cube petit" is an autonomous robot that can be integrated into people's lives. It is being developed as a low-cost, compact robot kit to be used widely throughout the world.

  • Offline Japanese and English speech synthesis and recognition

  • Uses OpenJTalk for Japanese speech synthesis and Julius for recognition, enabling speech with various speeds and emotions

Charging dock

We have developed a charging dock equipped with "a charging connector that enables rapid charging" and "a ROS package that aligns toward the AR marker."
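The marker-alignment behavior can be sketched as a proportional controller on the AR marker pose expressed in the robot frame (gains, thresholds, and the function name are illustrative, not the actual ROS package's API):

```python
import math

def docking_command(marker_x, marker_y,
                    k_lin=0.3, k_ang=1.0, stop_dist=0.05):
    """Compute (linear, angular) velocities steering toward an AR marker.

    The marker pose is given in the robot frame (x forward, y left),
    e.g. from an AR-marker detection node; stop when close enough to dock.
    """
    dist = math.hypot(marker_x, marker_y)
    if dist < stop_dist:
        return 0.0, 0.0                       # docked: stop
    heading = math.atan2(marker_y, marker_x)  # bearing toward the marker
    return k_lin * dist, k_ang * heading

lin, ang = docking_command(1.0, 0.0)  # marker straight ahead
print(lin, ang)                       # drives forward, no turn
```

In practice the final approach would also align the robot's yaw with the marker normal so the charging connector mates squarely.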

Providing autonomous mobile robots via RaaS (Robot as a Service)
3-month free trial

* 3-month free trial terms of use:

Limited to use in places where people come and go (e.g., halls, public spaces).
