ImPACT Tough Robotics Challenge Invited Talk Wednesday 4 Nov, 11h00 (UTC +4)
Prof. Satoshi Tadokoro
Director, Tough Cyberphysical AI Research Center, Tohoku University
Professor, Graduate School of Information Sciences, Tohoku University
How Robots are Helping with COVID-19 and How They Can Do More in the Future Invited Talk Wednesday 4 Nov, 17h00 (UTC +4)
Prof. Robin Murphy
Professor of Computer Science and Engineering at Texas A&M University
Robots that see and navigate outdoors Invited Talk Thursday 5 Nov, 11h00 (UTC +4)
Prof. Peter Corke
Distinguished Professor, Queensland University of Technology
ARC Centre of Excellence for Robotic Vision
Desert Locust Control Management: Innovative Technology on the Front Lines in the Battle Against Locusts Invited Talk Thursday 5 Nov, 17h00 (UTC +4)
Dr. Mamoon AlSarai Alalawi
Executive Secretary, FAO Commission for Controlling the Desert Locust in the Central Region (CRC)
Food and Agriculture Organization of the United Nations (FAO)
Cairo, Egypt
Mobile robot olfaction: Towards search and rescue robot dogs Invited Talk Friday 6 Nov, 11h00 (UTC +4)
Prof. Lino Marques
Institute of Systems and Robotics, University of Coimbra, PT
Agile and Resilient Autonomous Flight Invited Talk Friday 6 Nov, 17h00 (UTC +4)
Prof. Giuseppe Loianno
New York University, USA
Abstract

The ImPACT Tough Robotics Challenge is a national project of the Japan Cabinet Office that focuses on tough robot technologies that provide solutions for disaster response, recovery, and preparedness. Natural and man-made disasters are a serious problem for our society, and robotics is widely recognized as a potentially effective countermeasure. Its capability is limited, however, by the fragility of the technology: many robot technologies work only under well-prepared conditions, which do not occur at disaster sites. This project aims at making such technologies ‘tougher’ by removing these constraints.

It consists of subprojects on six types of robot platforms, together with several component technologies integrated with them: the Cyber Rescue Canine (digitally empowered rescue dogs); serpentine robots for search in debris; serpentine robots for plant and infrastructure inspection; UAVs for gathering information over wide disaster areas; legged robots for plant and infrastructure inspection in risky places; and construction robots for recovery tasks that require both high power and precision.

This keynote speech introduces some of the major research outcomes of this project.


Short Bio

Satoshi Tadokoro graduated from the University of Tokyo in 1984. He was an associate professor at Kobe University from 1993 to 2005 and has been a professor at Tohoku University since 2005. He was a Vice/Deputy Dean of the Graduate School of Information Sciences in 2012-14 and has been the Director of the Tough Cyberphysical AI Research Center at Tohoku University since 2019. He has been the President of the International Rescue System Institute since 2002 and was the President of the IEEE Robotics and Automation Society in 2016-17. He served as a program manager of the MEXT DDT Project on rescue robotics in 2002-07 and as project manager of the Japan Cabinet Office ImPACT Tough Robotics Challenge Project on disaster robotics in 2014-19, which had 62 international PIs and 300 researchers and created the Cyber Rescue Canine, the Dragon Firefighter, etc.

His research team at Tohoku University has developed various rescue robots, two of which, Quince and Active Scope Camera, are widely recognized for their contributions to disaster response, including missions in the Fukushima Daiichi NPP reactor buildings.

Abstract

This talk will describe how ground, aerial, and marine robots have been used to protect healthcare workers from unnecessary exposure, handle the surge in demand for clinical care, prevent infections, restore economic activity, and maintain individual quality of life during the first nine months of the COVID-19 pandemic. The talk is based on an analysis of the publicly available Robotics For Infectious Diseases (R4ID) dataset of over 200 instances capturing how robots have been used in 33 countries. The uses span six sociotechnical work domains and 30 different use cases representing different missions, robot work envelopes, and human-robot interaction dyads. The dataset also provides a model of adoption of robotics technology for disasters. Adoption favors robots that maximize the suitability for established use cases while minimizing risk of malfunction, hidden workload costs, or unintended consequences as measured by the NASA Technology Readiness Assessment metrics. Regulations do not present a major barrier, but availability, either in terms of inventory or prohibitively high costs, does. The model suggests that in order to be prepared for future events, roboticists should partner with responders now, investigate how to rapidly manufacture complex, reliable robots, and conduct fundamental research on predicting and mitigating risk in extreme or novel environments.
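
As a rough illustration of the kind of tallying such a dataset analysis involves, the Python sketch below counts instances per work domain over a toy record list. The field names and entries are invented for illustration and do not reflect the actual R4ID schema (see roboticsForInfectiousDiseases.org for the real data).

```python
from collections import Counter

# Hypothetical records in the spirit of the R4ID dataset; the real
# schema and entries differ.
instances = [
    {"country": "China", "domain": "clinical care", "use_case": "telepresence"},
    {"country": "USA", "domain": "public safety", "use_case": "quarantine enforcement"},
    {"country": "Japan", "domain": "clinical care", "use_case": "disinfection"},
    {"country": "USA", "domain": "laboratory automation", "use_case": "sample handling"},
]

# Tally instances per sociotechnical work domain.
by_domain = Counter(rec["domain"] for rec in instances)
for domain, n in by_domain.most_common():
    print(f"{domain}: {n} instances")

# Count distinct countries and use cases, mirroring the summary figures
# quoted in the abstract (33 countries, 30 use cases in the full data).
print("countries:", len({rec["country"] for rec in instances}))
print("use cases:", len({rec["use_case"] for rec in instances}))
```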


Short Bio

Robin Murphy is the Raytheon Professor of Computer Science and Engineering at Texas A&M University, a TED speaker, and an IEEE and ACM Fellow, cited for her work in creating the fields of disaster robotics and human-robot interaction. She has deployed robots to 29 disasters in five countries, including the 9/11 World Trade Center, Fukushima, the Syrian boat refugee crisis, Hurricane Harvey, and the Kilauea volcanic eruption. Her contributions have been recognized with the ACM Eugene L. Lawler Award for Humanitarian Contributions. Murphy is co-founder of the IEEE Safety Security Rescue Robotics technical committee and its annual symposium. She co-chaired the 2015 White House OSTP and NSF workshops on robotics for infectious diseases and is a current member of the 2020 National Academy of Sciences/Computing Community Consortium task force. See more at roboticsForInfectiousDiseases.org.

Abstract

Visual perception is a critical competence for robots, particularly for those that are mobile and operate in the field. This talk will define and motivate the problem of robotic vision and describe its challenges as well as recent progress at the Australian Centre for Robotic Vision. This includes component technologies such as novel cameras, deep learning for computer vision, transfer learning for manipulation, and evaluation methodologies, as well as end-to-end systems for applications such as logistics, agriculture, environmental remediation, and asset inspection.


Short Bio

Peter is a robotics researcher and educator. He is a Distinguished Professor of Robotic Vision at Queensland University of Technology, Director of the ARC Centre of Excellence for Robotic Vision, and Chief Scientist of Dorabot. His research is concerned with enabling robots to see and with the application of robots to mining, agriculture, and environmental monitoring. He created widely used open-source software for teaching and research, wrote the best-selling textbook “Robotics, Vision and Control”, created several MOOCs and the Robot Academy, and has won national and international recognition for teaching, including 2017 Australian University Teacher of the Year. He is a Fellow of the IEEE, the Australian Academy of Technology and Engineering, and the Australian Academy of Science; former Editor-in-Chief of the IEEE Robotics & Automation Magazine; founding editor of the Journal of Field Robotics; founding multimedia editor and executive editorial board member of the International Journal of Robotics Research; member of the editorial advisory board of the Springer Tracts in Advanced Robotics series; and recipient of the Qantas/Rolls-Royce and Australian Engineering Excellence awards. He has held visiting positions at Oxford, the University of Illinois, Carnegie Mellon University, and the University of Pennsylvania. He received his undergraduate and master's degrees in electrical engineering and his PhD from the University of Melbourne.

Abstract

The desert locust, Schistocerca gregaria (Forsskål), is one of the most devastating pests in agriculture. Locust swarms can fly hundreds of kilometres between breeding areas and rapidly move from one country or region to another. If infestations are not detected early, massive plagues can develop that often take several years and hundreds of millions of dollars to bring under control. Combating these fast-moving locust swarms requires special technical, managerial, and logistical skills, and national, regional, and international capacities are required to coordinate locust control operations effectively and to minimize the risks to food security and the subsequent economic and social damage. For over 65 years, FAO has been helping at-risk countries win the fight against locust infestations. FAO has developed and used modern tools such as eLocust3, track guidance systems, ultra-low-volume spraying, and drone technologies that allow locust management, field survey, and control teams to map the movements of locusts across countries and launch control operations before the locusts cause severe damage.


Short Bio

Dr. Mamoon AlSarai Alalawi holds a PhD and has 30 years of working experience in pest management (Dubas bug, red palm weevil, date palm pathogens, etc.), pesticide management, desert locust management, and spray techniques (aerial and ground). He worked for 21 years in the Ministry of Agriculture and Fisheries of Oman, where he was Director of the Plant Protection Department. Since November 2011 he has been working with FAO as Executive Secretary of the Commission for Controlling the Desert Locust in the Central Region (CRC), based in Egypt.

Abstract

Smell is a valuable sense for search and rescue operations. These are frequently carried out by human brigades supported by highly skilled dogs trained to search for, detect, and track the scent of victims or lost persons. Robots have been used in multiple disaster scenarios over the past twenty years or so, mainly as tools for the visual inspection of dangerous or hard-to-reach locations formed by the collapse of structures. Although significant progress has been made in robot locomotion and in visual sensing and perception, there is still a large gap in olfaction to close before robot dogs can be used in search and rescue operations, effectively replacing their animal counterparts.

This talk surveys and analyses the field of mobile robot olfaction, from sensing to decision-making algorithms for olfactory-guided search, and identifies the most significant bottlenecks to building effective robot dogs that use smell to find targets.
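
As a concrete example of one such decision-making algorithm, the sketch below implements the classic moth-inspired "surge and cast" strategy often discussed in the mobile robot olfaction literature: surge upwind while the odor is sensed, and cast (sweep) crosswind when plume contact is lost. The sensor model and probabilities here are invented toy stand-ins, not taken from the talk.

```python
import random

def surge_and_cast(odor_detected: bool, casting_dir: int):
    """Return the next action for a moth-inspired plume tracker.

    Surge upwind while the plume is sensed; when contact is lost,
    cast crosswind, flipping the casting direction on each lost step.
    """
    if odor_detected:
        return "surge_upwind", casting_dir
    # Plume lost: cast crosswind, alternating left and right.
    return ("cast_left" if casting_dir > 0 else "cast_right"), -casting_dir

# Toy run: odor contact is intermittent, as in a turbulent plume.
casting_dir = 1
for step in range(8):
    detected = random.random() < 0.5  # stand-in for a gas sensor reading
    action, casting_dir = surge_and_cast(detected, casting_dir)
    print(f"step {step}: detected={detected} -> {action}")
```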


Short Bio

Lino Marques is an Associate Professor at the University of Coimbra and a Senior Researcher at the Institute of Systems and Robotics, where he heads the Field Robotics group. His research activities focus on mobile robot olfaction, multi-robot systems, robotics for hazardous environments, and field and service robotics. In recent years he has been involved in several national and European research projects, having been the local coordinator of FP7 TIRAMISU, which researched methods for autonomous robotic detection of landmines, and of FP6 GUARDIANS, which researched swarm robotics approaches to support firefighting brigades. He has been involved in the organization of multiple scientific conferences and workshops, including a series of workshops on Robotics for Environmental Monitoring held at the ICRA, IROS, and RSS conferences. He is Editor-in-Chief for Robots and Multi-Robot Systems of the International Journal of Advanced Robotic Systems (SAGE) and has been guest editor of special issues of other robotics journals, such as the IEEE Robotics & Automation Magazine, Autonomous Robots, and Robotics and Autonomous Systems. He chairs the IEEE Robotics and Automation Society Special Interest Group on Humanitarian Technology (RAS-SIGHT) and vice-chairs the Portuguese Robotics Society.

Abstract

Flying robots, often called drones, are starting to play a major role in tasks such as search and rescue, interaction with the environment, inspection, patrolling, and monitoring. Agile and resilient navigation of Micro Aerial Vehicles (MAVs) through unknown environments poses a number of challenges in terms of perception, state estimation, planning, and control. To achieve this, MAVs have to localize themselves and coordinate with each other in unstructured environments. In this talk, I will present some recent research results on agile and resilient navigation of aerial robots for search and rescue, exploration, transportation, physical interaction with the environment, and human-drone collaboration, using a minimal on-board sensor suite composed of an IMU and cameras.
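
As a toy illustration of what a minimal IMU-plus-camera suite provides, the sketch below fuses noisy accelerometer integration with intermittent camera position fixes along a single axis using a simple complementary update. All rates, noise levels, and the gain are invented for illustration; a real visual-inertial pipeline is far more sophisticated than this one-dimensional example.

```python
import random

DT = 0.01          # IMU period (s), assumed
CAM_EVERY = 20     # camera fix every 20 IMU steps, assumed
GAIN = 0.2         # complementary-filter blend toward the camera fix

# True motion: constant acceleration along one axis.
true_acc = 0.5
true_pos, true_vel = 0.0, 0.0
pos_est, vel_est = 0.0, 0.0

for k in range(200):
    # Propagate the true state.
    true_vel += true_acc * DT
    true_pos += true_vel * DT

    # IMU prediction: integrate a noisy accelerometer reading (drifts).
    acc_meas = true_acc + random.gauss(0.0, 0.1)
    vel_est += acc_meas * DT
    pos_est += vel_est * DT

    # Camera correction: occasionally blend in a noisy position fix,
    # pulling the drifting IMU-only estimate back toward the truth.
    if k % CAM_EVERY == 0:
        cam_pos = true_pos + random.gauss(0.0, 0.01)
        pos_est += GAIN * (cam_pos - pos_est)

print(f"true position {true_pos:.3f} m, fused estimate {pos_est:.3f} m")
```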


Short Bio

Giuseppe Loianno is an assistant professor at New York University and director of the Agile Robotics and Perception Lab (https://wp.nyu.edu/arpl/), working on autonomous Micro Aerial Vehicles. Prior to NYU he was a lecturer, research scientist, and team leader at the General Robotics, Automation, Sensing and Perception (GRASP) Laboratory at the University of Pennsylvania. He received his BSc and MSc degrees in automation engineering, both with honors, from the University of Naples "Federico II" in December 2007 and February 2010, respectively, and his PhD in computer and control engineering, focusing on robotics, in May 2014. Dr. Loianno has published more than 70 conference papers, journal papers, and book chapters. His research interests include visual odometry, sensor fusion, and visual servoing for micro aerial vehicles. He received the Conference Editorial Board Best Reviewer Award at ICRA 2016 and the National Italian American Foundation (NIAF) Young Investigator Award in 2018. He was the program chair for IEEE SSRR 2018, has organized multiple workshops on Micro Aerial Vehicles at the IROS conferences, and created the International Symposium on Aerial Robotics (ISAR). His work has been featured in a large number of renowned international news outlets and magazines.