Search results for “Indoor navigation strategies for aerial autonomous systems”
Autonomous Vision-Based Navigation System for Unmanned Aerial Vehicles (UAVs)
 
19:41
Autonomous navigation of unmanned aerial vehicles is an important area of research in artificial intelligence. Intelligence in an autonomous vehicle must include strategies for mobility in order to achieve higher-level functional tasks. This project aims to enable a low-cost quadcopter, equipped with on-board cameras/sensors and coupled wirelessly with a ground-based laptop, to navigate autonomously in previously unknown, GPS-denied (indoor/outdoor) environments based on computer vision.
Views: 1234 LSI Carlos Tercero
Thermal-Inertial Localization for Autonomous Navigation of Aerial Robots through Obscurants
 
01:49
In this video, the problem of GPS-denied aerial robot navigation through obscurants is considered. Through the fusion of thermal camera and IMU data, estimation of the robot trajectory in such degraded visual environments is achieved. The system is demonstrated in a heavily smoke-filled machine shop environment. Enabling the navigation of aerial robots through obscurants can prove critical in important applications such as search and rescue in underground mines, or in many real-life surveillance scenarios. Find out more: www.autonomousrobotslab.com
Views: 1633 Kostas Alexis
Experiments in Fast, Autonomous, GPS-Denied Quadrotor Flight
 
03:01
In this work, we describe our quadrotor system, which is able to smoothly navigate through mixed indoor and outdoor environments and to fly at speeds of more than 18 m/s using vision-based state estimates. We demonstrate the robustness of our system through high-speed autonomous flights and navigation through a variety of obstacle-rich environments. To appear in ICRA 2018.
Views: 2098 Vijay Kumar
Autonomous UAV for Interior Surveys of Hazardous Spaces
 
03:17
Vision-based teleoperated flight in cluttered environments with collision avoidance, simultaneous altitude and terrain elevation estimation using downward rangefinders, and efficient perceptual modeling of indoor scenes using Hierarchical Gaussian Mixture Models from the Resilient Intelligent Systems Lab (Director: Prof. Nathan Michael) at Carnegie Mellon University.
Visual Saliency-aware Receding Horizon Autonomous Exploration with Application to Aerial Robotics
 
01:31
In this paper we present autonomous visual saliency-aware receding horizon exploration using aerial robots. Through a model of visual attention, incrementally built maps are annotated with the visual saliency of different objects and entities in the environment. Given this information, a path planning strategy is developed that simultaneously optimizes for exploring unknown space and for directing the robot's attention to better observe salient regions. Following a 2-step optimization paradigm, the algorithm first samples a random tree and identifies the branch that maximizes the expected exploration gain. The first viewpoint of this branch is retained, but the path towards it is chosen through a second planning step that optimizes for the reobservation of salient regions at sufficient resolutions. The robot follows this reference and the whole process is iteratively repeated in a receding horizon fashion. The presented results refer to the exploration of two environments with different salient objects, namely a room with paintings and a mannequin, as well as a machine shop that includes salient objects such as warning signs and a fire extinguisher.
Views: 460 Kostas Alexis
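The two-step optimization described above lends itself to a compact illustration. Below is a minimal, self-contained Python toy of the idea on a 2-D grid: sample candidate paths, commit only to the first viewpoint of the best-exploring branch, then prefer a path to it that reobserves salient cells. The grid, gain definitions and sampling scheme are illustrative assumptions, not the authors' implementation.

```python
import random

GRID = 20
unknown = {(x, y) for x in range(GRID) for y in range(GRID)}  # unexplored cells
saliency = {(5, 5): 1.0, (12, 3): 0.8}  # salient cells -> attention weight
SENSOR_RADIUS = 2

def visible(cell, pose):
    # square field-of-view approximation around a viewpoint
    return abs(cell[0] - pose[0]) <= SENSOR_RADIUS and abs(cell[1] - pose[1]) <= SENSOR_RADIUS

def exploration_gain(path):
    # how many still-unknown cells the path's viewpoints would observe
    return len({c for c in unknown if any(visible(c, p) for p in path)})

def saliency_gain(path):
    # weighted count of salient cells reobserved along the path
    return sum(w for c, w in saliency.items() if any(visible(c, p) for p in path))

def random_path(start, length=5):
    path, pose = [], start
    for _ in range(length):
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        pose = (min(GRID - 1, max(0, pose[0] + dx)), min(GRID - 1, max(0, pose[1] + dy)))
        path.append(pose)
    return path

def plan_step(pose, n=200):
    # Step 1: best-exploring branch; commit only to its first viewpoint.
    goal = max((random_path(pose) for _ in range(n)), key=exploration_gain)[0]
    # Step 2: among paths reaching that viewpoint, prefer saliency reobservation.
    reaching = [p for p in (random_path(pose) for _ in range(n)) if p[-1] == goal]
    return max(reaching, key=saliency_gain) if reaching else [goal]

print(plan_step((10, 10)))
```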
Autonomous UAV Navigation with Domain Adaptation (AUNDA)
 
05:07
More info: https://arxiv.org/abs/1712.03742 | Updated version for RC car: https://youtu.be/BQXtX14e-0M
Views: 178 유재윤
Low Cost Indoor Robot Navigation using IR | TI ARM CORTEX
 
03:44
Controller: TM4C123GXL. Communication between PC and robot: HC-05 Bluetooth. Coordinate transmission: IR. LCD: EA DOGM128. See more at http://www.ramakula.com/joey/
Views: 536 Ram Akula
RI Seminar: Kostas Alexis: Autonomous Exploration and Inspection using Aerial Robots
 
56:32
Kostas Alexis, Assistant Professor, University of Nevada, Reno. October 28, 2016. Autonomous Exploration and Inspection using Aerial Robots. Abstract: The capacity of aerial robots to autonomously explore, inspect and map their environment is key to many applications. This talk will overview and discuss a set of new, and in their majority open-sourced and experimentally verified, sampling-based strategies that break new ground on how a robot can efficiently inspect a structure for which a prior model exists, how to explore unknown environments, and how to actively combine the planning and perception loops to achieve autonomous exploration with maintained levels of 3D mapping fidelity. In particular, we will detail recent developments in the field of active perception and belief-space planning for autonomous exploration. Finally, an overview of further research activities on aerial robotics, including solar-powered unmanned aerial vehicles and aerial manipulators, will be provided.
Speaker Biography: Kostas Alexis obtained his Ph.D. in the field of aerial robotics control and collaboration from the University of Patras, Greece in 2011. His Ph.D. research was supported by the Greek national-European Commission Excellence scholarship. After successfully defending his Ph.D. thesis, he was awarded a Swiss Government fellowship and moved to ETH Zurich in Switzerland. From 2011 to June 2015 he held the position of senior researcher at the Autonomous Systems Lab, ETH Zurich, leading the lab's efforts in the fields of control and path planning for advanced navigational and operational autonomy. His research interests lie in the fields of control, navigation, optimization and path planning, focusing on aerial robotic systems with multiple and hybrid configurations. He is the author or co-author of more than 50 scientific publications and has received several best paper awards and distinctions, including the IET Control Theory & Applications Premium Award 2014. Furthermore, together with his collaborators, he has achieved world records in the field of solar-powered flight endurance. Kostas Alexis has participated in and organized several large-scale, multi-million-dollar research projects with broad international involvement and collaboration. In July 2015, Kostas moved to the University of Nevada, Reno, with the goal of dedicating his efforts towards establishing true autonomy for aerial and other kinds of robots.
Views: 2134 cmurobotics
RIFT - Autonomous Navigation for UAVs
 
01:22
High-Speed Autonomous Road Identification, Following and Tracking (RIFT) algorithm, developed by the UAV Lab team, Aerospace Dept., Indian Institute of Science, Bengaluru. RIFT: identifies road patches using computer vision; identifies junctions using artificial neural networks; manoeuvres the drone to follow roads and take turns. P.S. We do not own copyrights for the audio track.
Hexacopter position hold, by Johan Fogelberg, Lund University & Combine Control Systems AB
 
01:14
Navigation and Autonomous Control of a Hexacopter in Indoor Environments. This thesis presents methods for estimation and autonomous control of a hexacopter, an unmanned aerial vehicle with six rotors. The hexacopter used is an ArduCopter 3DR Hexa B, and the work follows a model-based approach using Matlab Simulink, running the model on a PandaBoard ES after automatic code generation. The main challenge is to investigate how data from an Inertial Measurement Unit can be used to aid an already implemented computer vision algorithm in a GPS-denied environment. First, a physical representation is created by Newton-Euler formalism to be used as a base when developing algorithms for estimation and control. To estimate the position and velocity of the hexacopter, an unscented Kalman filter is implemented for sensor fusion. Sensor fusion is the combining of data from different sensors to obtain better results than if the sensors were used individually. Control strategies for vertical and horizontal movement are developed using cascaded PID control. These high-level controllers feed the ArduCopter with setpoints for low-level control of angular orientation and throttle. To test the algorithms in a safe way, a simulation model is used where the real system is replaced by blocks containing a mix of differential equations and transfer functions from system identification. When a satisfying behavior in simulation is achieved, tests on the real system are performed. The result of the improvements made on estimation and control is a more stable flight performance with less drift, both in simulation and on the real system. The hexacopter can now hold position for over a minute with low drift. Air turbulence, sensor and computer vision imperfections, as well as the absence of a hard real-time system, degrade the position estimation and cause drift if movement speed is anything but very slow.
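The cascaded PID structure described in the thesis summary above is easy to sketch: an outer position loop produces a velocity setpoint for an inner velocity loop, whose output becomes an angle setpoint for the ArduCopter's low-level control. The Python sketch below illustrates that structure only; the gains, loop rate and single-axis form are illustrative assumptions, not the thesis values.

```python
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

dt = 0.02                          # assumed 50 Hz control loop
pos_pid = PID(1.2, 0.0, 0.3, dt)   # outer loop: position -> velocity setpoint
vel_pid = PID(0.8, 0.1, 0.05, dt)  # inner loop: velocity -> pitch-angle setpoint

def horizontal_control(x_ref, x_est, vx_est):
    vx_ref = pos_pid.step(x_ref, x_est)       # outer position loop
    pitch_ref = vel_pid.step(vx_ref, vx_est)  # inner velocity loop; setpoint for low-level control
    return pitch_ref

print(horizontal_control(x_ref=1.0, x_est=0.0, vx_est=0.0))
```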
Active Autonomous Aerial Exploration for Ground Robot Path Planning
 
02:03
We address the problem of planning a path for a ground robot through unknown terrain, using observations from a flying robot. In search and rescue missions, which are our target scenarios, the time from arrival at the disaster site to the delivery of aid is critically important. Previous works required exhaustive exploration before path planning, which is time-consuming but eventually leads to an optimal path for the ground robot. Instead, we propose active exploration of the environment, where the flying robot chooses regions to map in a way that optimizes the overall response time of the system, which is the combined time for the air and ground robots to execute their missions. In our approach, we estimate terrain classes throughout our terrain map, and we also add elevation information in areas where the active exploration algorithm has chosen to perform 3D reconstruction. This terrain information is used to estimate feasible and efficient paths for the ground robot. By exploring the environment actively, we achieve superior response times compared to both exhaustive and greedy exploration strategies. We demonstrate the performance and capabilities of the proposed system in simulated and real-world outdoor experiments. To the best of our knowledge, this is the first work to address ground robot path planning using active aerial exploration. Reference: J. Delmerico, E. Mueggler, J. Nitsch, D. Scaramuzza, "Active Autonomous Aerial Exploration for Ground Robot Path Planning", IEEE Robotics and Automation Letters (RA-L), 2016. http://rpg.ifi.uzh.ch/docs/RAL16_Delmerico.pdf Our research page on active vision: http://rpg.ifi.uzh.ch/research_active_vision.html Robotics and Perception Group, University of Zurich, 2016 http://rpg.ifi.uzh.ch/
Views: 767 ailabRPG
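The core trade-off in the description above, mapping time versus ground travel time, can be made concrete with a small sketch. The Python toy below picks the next region for the flying robot to map only if the expected reduction in the ground robot's travel time outweighs the flight time spent mapping it; the cost model and numbers are illustrative assumptions, not the paper's utility function.

```python
def choose_region(regions, ground_path_time, fly_time, expected_saving):
    # regions: candidate areas for 3-D reconstruction
    # expected_saving[r]: predicted reduction in ground travel time if r is mapped
    best, best_total = None, ground_path_time  # baseline: stop mapping, drive now
    for r in regions:
        total = fly_time[r] + (ground_path_time - expected_saving[r])
        if total < best_total:
            best, best_total = r, total
    return best  # None means further mapping no longer pays off

print(choose_region(["A", "B"], ground_path_time=100.0,
                    fly_time={"A": 10.0, "B": 30.0},
                    expected_saving={"A": 25.0, "B": 20.0}))
```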
MIT's NanoMap enables Drone Navigation in Uncertain Environments | QPT
 
01:31
Companies like Amazon have big ideas for drones that can deliver packages right to your door. But even putting aside the policy issues, programming drones to fly through cluttered spaces like cities is difficult. Being able to avoid obstacles while traveling at high speeds is computationally complex, especially for small drones that are limited in how much they can carry onboard for real-time processing. Many existing approaches rely on intricate maps that aim to tell drones exactly where they are relative to obstacles, which isn’t particularly practical in real-world settings with unpredictable objects. If their estimated location is off by even just a small margin, they can easily crash. With that in mind, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed NanoMap, a system that allows drones to consistently fly at 20 miles per hour through dense environments such as forests and warehouses. The research team says that the system could be used in fields ranging from search and rescue and defense to package delivery and entertainment. It can also be applied to self-driving cars and other forms of autonomous navigation. Photo: Jonathan How/MIT. Video Source: MITCSAIL. News Source: http://news.mit.edu/2018/mit-csail-programming-drones-fly-face-uncertainty-0212
Autonomous Navigation in Underground Mines
 
04:43
The ARIDuA project is funded by the European Social Fund (ESF). In this video the research robot "Julius" navigates through the local research mine "Reiche Zeche". To do so, it uses the following components:
Setup (ROS-indigo): http://www.ros.org/about-ros/
Sensor (KinectONE): https://www.xbox.com/en-US/xbox-one/accessories/kinect , http://wiki.ros.org/libfreenect
Sensor (Velodyne Puck): http://velodynelidar.com/vlp-16.html , http://wiki.ros.org/velodyne
Sensor (SICK LMS 511): https://www.sick.com/de/en/detection-and-ranging-solutions/2d-lidar-sensors/lms5xx/lms511-20100-pro/p/p216240 , http://wiki.ros.org/LMS1xx
Mapping & Localization (RTAB-Map): http://introlab.github.io/rtabmap/ , http://wiki.ros.org/rtabmap
Navigation & Path Planning: http://wiki.ros.org/navigation , http://wiki.ros.org/move_base
For further information regarding our project please visit: http://aridua.tu-freiberg.de
Views: 738 Mining-RoX
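Since the robot above uses the standard ROS navigation stack (move_base, linked in the description), a minimal example of commanding it fits here. The rospy snippet below sends a single 2-D goal to an already-configured move_base action server; the frame name and coordinates are placeholder assumptions.

```python
#!/usr/bin/env python
# Send one navigation goal to a running move_base action server (ROS 1).
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node("send_nav_goal")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"   # assumed global frame
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 2.0     # placeholder goal, 2 m ahead
goal.target_pose.pose.orientation.w = 1.0  # keep current heading

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo("Navigation result state: %s", client.get_state())
```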
3D Navigation for UAVs in GPS denied environments
 
03:43
The video is concerned with the autonomous navigation of an unmanned aerial vehicle in three-dimensional space without GPS. It analyses and explains several sensor groups and their corresponding algorithms for perceiving the environment in urban search and rescue missions. Furthermore, a first approach to the implementation of such navigation is presented, which can be used both in simulation and on an arbitrary UAV, thanks to the Robot Operating System. The data was captured within the TRADR project (http://www.tradr-project.eu/) during the TJEX 2014 (http://robohub.org/first-tradr-eu-fp7-joint-exercise-tjex/)
Views: 969 RoblabFhGe
Ngatalie Indoor Autonomous Navigation Robot
 
04:37
An implementation of the ROS Navigation Stack. See other video at http://www.youtube.com/watch?v=mZrx58R4iT4 The robot is named after my mother. Sorry for the broken zoom; she runs all the time. I do not own the song. Please support the owner if you like it.
Views: 337 Britsk Nguyen
The Smart Security System by Easy Aerial
 
01:31
Easy Aerial develops and manufactures the Smart Security System, a mobile, durable & fully autonomous drone-based monitoring system for perimeter security.
Views: 559 Easy Aerial
Autonomous Navigation in Dynamic Social Environments using Multi-Policy Decision Making
 
01:39
Additional material for the IROS'2016 publication. You can find more info at https://april.eecs.umich.edu/papers/details.php?name=mehta2016iros
Views: 222 Gonzalo Ferrer
Autonomous Navigation Robot - II
 
06:00
Static map navigation and static obstacle avoidance. Special thanks to e-yantra, IITB and ARM Embedded Technologies Pvt. Ltd. Project done by: Dr. D. Sheshachalam, Harish V. Mekali, Karthik G, Bharath
Views: 985 Harish Mekali
Autonomous terrain following and collision avoidance with a 400-gram UAV
 
02:09
This video shows a 400-gram unmanned aircraft flying fully autonomously using the optiPilot control strategy (based on computer mouse sensors) for flight stabilization, terrain following and collision avoidance. GPS is used on top of optiPilot to provide global guidance. The video shows a flight at low altitude (around 9 meters) in a natural outdoor environment over agricultural terrain, with two copses of trees requiring efficient collision avoidance action. For more information visit http://lis.epfl.ch/microflyers and http://sensefly.com. Direct link to scientific paper: http://infoscience.epfl.ch/record/142989
Views: 19338 jzuffere
Faster, Lighter, Smarter: DARPA Gives Small Autonomous Systems a Tech Boost
 
00:50
DARPA’s Fast Lightweight Autonomy (FLA) program recently completed Phase 2 flight tests, demonstrating advanced algorithms designed to turn small air and ground systems into team members that could autonomously perform tasks dangerous for humans – such as pre-mission reconnaissance in a hostile urban setting or searching damaged structures for survivors following an earthquake. Read More : https://wp.me/p9Volu-WO
Views: 15 AthisTech
Trajectory Tracking and Formation Flight of Autonomous UAV
 
03:49
In this research, we present efficient control strategies based on onboard low-cost and commodity sensors for autonomous UAVs in GPS-denied environments. Autonomous trajectory tracking and formation control were accomplished relying solely on onboard sensing and computation.
Views: 1718 SYSU Robotics Lab
Radiation Source Localization in GPS-denied Environments using Aerial Robots
 
01:29
In this paper we present the system and methods to enable autonomous nuclear radiation source localization using aerial robots in GPS-denied environments. A Thallium-doped Cesium Iodide scintillator and a Silicon Photomultiplier are combined with custom-built electronics for counting and spectroscopic information, while visual-inertial localization enables their pose annotation without GPS or any other external tracking system. Given this capability, a strategy for radioactive source localization, as well as active source-search path planning, is developed. The proposed methods are motivated by the radiation counting statistics requiring longer dwell times at higher intensities, the limited endurance of small aerial robots, which entails a very small number of dwell points, as well as the varying uncertainty of position estimation that is implied by the use of visual-inertial localization. The presented results refer to the localization of a Cesium-137 source using a small aerial robot, both with and without prior knowledge of its environment.
Views: 666 Kostas Alexis
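The counting-statistics argument in the abstract above has a simple back-of-envelope form: a dwell of t seconds at an expected count rate r yields N = r*t Poisson counts with relative uncertainty 1/sqrt(N), so the dwell time needed for a target precision follows directly. The numbers in this sketch are illustrative, not from the paper.

```python
def dwell_time(rate_cps, target_rel_err):
    # N = rate * t and rel_err = 1/sqrt(N)  =>  t = 1 / (rel_err**2 * rate)
    return 1.0 / (target_rel_err ** 2 * rate_cps)

# e.g. at 50 counts/s, ~8 s of dwell gives 400 counts and 5 % relative error
print(dwell_time(rate_cps=50.0, target_rel_err=0.05))
```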
Vision-Based Distributed Formation Control of Unmanned Aerial Vehicles
 
03:23
We present a novel control strategy for a team of unmanned aerial vehicles (UAVs) to autonomously achieve a desired formation using only visual feedback provided by the UAVs' onboard cameras. This effectively eliminates the need for global position measurements. The proposed pipeline is fully distributed and encompasses a collision avoidance scheme. In our approach, each UAV extracts feature points from captured images and communicates their pixel coordinates and descriptors among its neighbors. These feature points are used in our novel pose estimation algorithm, QuEst, to localize the neighboring UAVs. Compared to existing methods, QuEst has better estimation accuracy and is robust to feature point degeneracies. We demonstrate the proposed pipeline in a high-fidelity simulation environment and show that UAVs can achieve a desired formation in a natural environment without any fiducial markers. Kaveh Fathian, Nicholas Gans, University of Texas at Dallas, USA. Emily Doucette, Willard Curtis, Air Force Research Laboratory, USA. Email: [email protected], [email protected] Website: https://sites.google.com/view/kavehfathian
Views: 107 Kaveh Fathian
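QuEst itself is the authors' pose estimation algorithm; as an illustrative stand-in for the same step in the pipeline (relative pose from communicated feature points), the sketch below uses OpenCV's classical essential-matrix approach, which recovers rotation and translation direction from matched pixel coordinates. The intrinsics and point arrays are assumed inputs.

```python
import numpy as np
import cv2

def relative_pose(pts_a, pts_b, K):
    # pts_a, pts_b: Nx2 float arrays of matched pixel coordinates
    # from the cameras of two UAVs; K: 3x3 camera intrinsics matrix.
    E, mask = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=mask)
    return R, t  # rotation and unit translation (scale is unobservable)
```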
World's first autonomous indoor drone surveillance system is here
 
02:02
Audioburst Video Aired On: Your Weekly Tech Update, 08-16-2018
Views: 20 Audioburst
Navantia Inspector CloseUp UAV
 
02:49
Drone for the inspection of ship cargo tanks with UAVs. Flying systems: collision avoidance, flight assistance, odometry in GPS-denied areas.
Views: 141 Avansig
Vision-based Collision Avoidance in UAVs
 
00:38
Monocular vision; supervised learning of texture features; unstructured natural environments like forests.
Vision-based Mobile Robot's SLAM and Navigation in Crowded Environments
 
04:40
Hasegawa Lab., Tokyo Institute of Technology, Japan http://haselab.info/ English: http://haselab.info/index-e.html Related papers: Hiroshi Morioka, Yi Sangkyu, Osamu Hasegawa: "Vision-based Mobile Robot's SLAM and Navigation in Crowded Environments", IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2011). Abstract: A vision-based mobile robot's simultaneous localization and mapping (SLAM) and navigation has been the source of countless research contributions because of rich sensory output and cost effectiveness of vision sensors. However, existing methods of vision-based SLAM and navigation are not effective for robots to be used in crowded environments such as train stations and shopping malls, because when we extract feature points from an image in crowded environments, many feature points are extracted from not only static objects but also dynamic objects such as humans. By recognizing all such feature points as landmarks, the algorithm collapses and errors occur in map building and self-localization. In this paper, we propose a SLAM and navigation method that is effective even in crowded environments by extracting robust 3D feature points from sequential vision images and odometry. By using the proposed method, we can eliminate unstable feature points extracted from dynamic objects and perform SLAM and navigation stably. We present experiments showing the utility of our approach in crowded environments, including map building and navigation.
Views: 7716 HasegawaLab
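The rejection step described in the abstract above, discarding feature points whose motion is not explained by the robot's own odometry, can be sketched compactly. In the Python sketch below, a feature is kept only if its observed pixel position stays close to the position predicted by reprojecting its 3-D estimate through the odometry motion; the threshold and the prediction interface are simplifying assumptions.

```python
import numpy as np

def static_features(tracks, predicted, thresh_px=3.0):
    # tracks: dict feature_id -> observed pixel position in the current frame
    # predicted: dict feature_id -> position predicted from the feature's 3-D
    #            estimate and the robot's odometry motion
    keep = {}
    for fid, obs in tracks.items():
        if fid in predicted and np.linalg.norm(
                np.asarray(obs) - np.asarray(predicted[fid])) < thresh_px:
            keep[fid] = obs  # consistent with ego-motion -> likely static
    return keep

obs = {1: (100.0, 120.0), 2: (300.0, 200.0)}
pred = {1: (101.0, 119.0), 2: (280.0, 205.0)}  # feature 2 moved inconsistently
print(static_features(obs, pred))               # keeps only feature 1
```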
Cognitive Mapping and Navigation for Mobile Robot
 
02:07
This work is published as: "Direction-driven navigation using cognitive map for mobile robots", V.A. Shim, B. Tian, M. Yuan, H. Tang, H. Li, Intelligent Robots and Systems (IROS 2014). However, you may also want to refer to "RGB-D based cognitive map building and navigation", B. Tian, V.A. Shim, M. Yuan, C. Srinivasan, H. Tang, H. Li, Intelligent Robots and Systems (IROS 2013). For researchers who are more interested in the computational model rather than robotics, please refer to "An Entorhinal-Hippocampal Model for Simultaneous Cognitive Map Building", M. Yuan, B. Tian, V.A. Shim, H. Tang, H. Li, Twenty-Ninth AAAI Conference. We have developed a neural cognitive robot named NECO which can perform indoor mapping and navigation in an unstructured environment. This video records the mapping and navigation process.
Views: 1837 B. Tian
Autonomous drone cinematographer, ISER 2018
 
02:15
Autonomous drone cinematographer: Using artistic principles to create smooth, safe, occlusion-free trajectories for aerial filming Rogerio Bonatti, Yanfu Zhang, Sanjiban Choudhury, Wenshan Wang, and Sebastian Scherer Carnegie Mellon University International Symposium on Experimental Robotics, 2018
Views: 1608 Rogério Bonatti
Stereo Vision based Navigation 4/4 (2008)
 
01:15
Stereo Vision based Navigation (2008). Only stereo vision is used for navigation in these demos, for self-localization, obstacle avoidance and path planning. One can see that the robot is able to move around objects that are hard for a robot to see, e.g. the legs of a table or chair, or the feet of a human. Self-localization: https://www.youtube.com/watch?v=gbjqrNbRg-k&list=UU2r0KZYBu0R0yRvREMCMdOQ
Views: 36 SvensRobots
Fully Autonomous Site Security & Monitoring - Percepto & Magos
 
01:26
Percepto and Magos have collaborated to offer a cutting-edge, fully autonomous perimeter monitoring, protection and security system. Thanks to advanced vision technologies and autonomous capabilities, Percepto and Magos can provide facilities of any size with unmanned, 24/7 peace of mind while performing a multitude of missions. The Sparrow drone can fly in any weather condition without human intervention, and can monitor and track a potential security threat while security teams coordinate a response. Follow us! LinkedIn: https://www.linkedin.com/company/perceptoautonomousdrones/ Twitter: https://twitter.com/perceptoDrones Facebook: https://www.facebook.com/perceptodrones About Percepto: Percepto offers advanced aerial functionalities to large-scale enterprises looking to improve their security and reduce human risk and operational costs while increasing site productivity thanks to constant site inspection. The proprietary PerceptoCore™ technology propels us forward and is the foundation for our unique capabilities. PerceptoCore™ relies on real-time machine vision and advanced AI technology, bringing to life a fully autonomous drone that can perform multiple missions without the need for human intervention. About Magos: Magos presents a completely revolutionary solution in civilian security. With cost-effective, high-resolution, state-of-the-art staring radars, Magos provides 24/7, 360º coverage of any protected site, replacing existing detection measures and giving volumetric perimeter protection in all weather and lighting conditions, with lower installation and maintenance costs, which results in better detection at a better price. Magos radars have detection ranges of up to 400 m for a person and 600 m for a vehicle or boat, low power consumption, and high range resolution, which lowers the false alarm rate, with 120-360 degree coverage depending on the model.
Human-UAV Interaction
 
01:47
A position-controlled quadrotor UAV is exploited to test a control strategy that allows one to "safely" interact with a backdrivable system. By means of this control strategy, the human is able to move the UAV around the flying area.
Views: 117 AIRobots
Robotics / Bio-Inspired Flying Robots - Jean-Christophe Zufferey / epflpress.com - polytechpress.com
 
05:00
http://goo.gl/qsyf9 This book demonstrates how bio-inspiration can lead to fully autonomous flying robots without relying on external aids. Most existing aerial robots fly in open skies, far from obstacles, and rely on external beacons -- mainly GPS -- to localize and navigate. However, these robots are not able to fly at low altitude or in confined environments, and yet this poses absolutely no difficulty to insects. Indeed, flying insects display efficient flight control capabilities in complex environments despite their limited weight and relatively tiny brain size. From sensor suite to control strategies, the literature on flying insects is reviewed from an engineering perspective in order to extract useful principles that are then applied to the synthesis of artificial indoor flyers. Artificial evolution is also utilized to search for alternative control systems and behaviours that match the constraints of small flying robots. Specifically, the basic sensory modalities of insects (vision, gyroscopes and airflow sense) are applied to develop navigation controllers for indoor flying robots. These robots are capable of mapping sensor information onto actuator commands in real time to maintain altitude, stabilize the course and avoid obstacles. The most prominent result of this novel approach is a 10-gram microflyer capable of fully autonomous operation in an office-sized room using fly-inspired vision, inertial and airspeed sensors.
Vision based GPS-denied Object Tracking and Following for Unmanned Aerial Vehicles - Summary, HUD
 
01:44
Images are available, for research purposes, here: * In the CVG-UPM ftp (user: publicftp, pass: usuarioftp): ftp://138.100.76.91/CVG_PUBLIC/DataSets/IBVS_ardrone2_datasets/ * In Google Drive: http://bit.ly/1sp3qxG We present a vision-based control strategy for tracking and following objects using an unmanned aerial vehicle. We have developed an image-based visual servoing method that uses only a forward-looking camera for tracking and following objects from a multi-rotor UAV, without any dependence on GPS systems. Our proposed method tracks a user-specified object continuously while maintaining a fixed distance from the object and simultaneously keeping it in the center of the image plane. The algorithm is validated using a Parrot AR Drone 2.0 in outdoor conditions while tracking and following people, under occlusions and with fast-moving objects, showing the robustness of the proposed system against perturbations and illumination changes. Our experiments show that the system is able to track a great variety of objects present in suburban areas, among others: people, windows, AC machines, cars and plants. http://robotics.asu.edu/ardrone2_ibvs/
Views: 478 Vision4UAV
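The control law implied by the description above, keeping the target centered while holding distance, reduces to a few proportional terms in a minimal image-based visual-servoing sketch. Below, pixel offsets drive yaw and climb rates, and the tracked box's apparent area serves as a distance proxy; the gains, image size and reference area are illustrative assumptions, not the paper's controller.

```python
def ibvs_command(cx, cy, box_area, img_w=640, img_h=360, ref_area=5000.0,
                 k_yaw=0.002, k_alt=0.002, k_fwd=1e-5):
    # cx, cy: tracked object's center in pixels; box_area: its bounding-box area
    ex = cx - img_w / 2.0      # horizontal pixel error -> yaw rate
    ey = cy - img_h / 2.0      # vertical pixel error -> climb rate
    ea = ref_area - box_area   # area error (distance proxy) -> forward velocity
    return {"yaw_rate": -k_yaw * ex,
            "climb_rate": -k_alt * ey,
            "forward_vel": k_fwd * ea}

print(ibvs_command(cx=400, cy=180, box_area=3000))
```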
optiPilot
 
02:00
This 400-gram flying wing flies autonomously using a vision system made of 5 optical mouse sensors. It uses a fly-inspired control strategy to avoid colliding with various types of ground, trees and buildings. http://lis.epfl.ch/microflyers
Views: 8539 jzuffere
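The fly-inspired strategy mentioned above can be caricatured in a few lines: translational optic flow from a handful of mouse sensors arranged around the flight direction is mapped linearly onto pitch and roll commands, so that high flow (approaching ground or obstacles) steers the wing away. The sensor layout and weights below are illustrative assumptions, not the published optiPilot parameters.

```python
def optipilot_step(flows, pitch_weights, roll_weights):
    # flows: translational optic-flow magnitudes, one per mouse sensor
    pitch_cmd = sum(w * f for w, f in zip(pitch_weights, flows))
    roll_cmd = sum(w * f for w, f in zip(roll_weights, flows))
    return pitch_cmd, roll_cmd

# Hypothetical 5-sensor layout: down, down-left, down-right, left, right.
flows = [0.8, 0.4, 0.5, 0.1, 0.2]
print(optipilot_step(flows,
                     pitch_weights=[1.0, 0.5, 0.5, 0.0, 0.0],
                     roll_weights=[0.0, -0.7, 0.7, -0.3, 0.3]))
```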
Aggressive Flight 2017
 
01:29
This video presents an autonomous 250 g quadrotor performing aggressive maneuvers using a Qualcomm Snapdragon Flight, relying only on on-board computation and sensing capabilities. The control, planning and estimation tasks are solved based on the information provided by a single camera and an IMU. We show aggressive trajectories around poles and through narrow window gaps at different inclinations. Our system is able to traverse narrow gaps requiring accelerations of up to 1.5 g and roll and pitch angles of up to 90 degrees, with velocities of 5 m/s. The current approach does not require any switching control strategy and is fully based on the information of only a single camera and IMU. This is the first time that aggressive maneuvers have been achieved with such a small-footprint vehicle, using only on-board sensors and without relying on external motion capture systems.
Views: 59808 Vijay Kumar
UAVIA Robotics Platform : Connected & Autonomous Drone Swarm with Smokes on !
 
02:13
This summer, Uavia's team had a little fun: to celebrate France's victory in the FIFA World Cup, we demonstrated our platform's swarming capabilities with smokes on!
Views: 502 UAVIA
RI Seminar: M. Ani Hsieh : Exploiting the Environment to Improve Autonomy
 
01:11:41
M. Ani Hsieh Associate Professor, Drexel University Exploiting the Environment to Improve Autonomy: Robots in Geophysical Flows Abstract Different from many aerial and ground robots, underwater robots operate in a communication and localization-limited environment where their dynamics are tightly coupled with the environmental dynamics. While the tight-coupling between vehicle and environment dynamics makes control challenging, it provides a unique opportunity for robots to exploit the environmental forces to improve and prolong their autonomy. In this talk, I'll show the limitations of existing air and ground based strategies and present our efforts in improving vehicle autonomy by better understanding the dynamics of the geophysical fluid environment. The talk will describe our efforts in using robot teams to track coherent structures. Coherent structures are of great importance since they give us a way to map and represent the dynamics of the fluid environment. I will then show how this information can then be exploited to develop more efficient control and coordination strategies for networks of AUVs/ASVs operating in these environments. Speaker Biography M. Ani Hsieh is an Associate Professor in the Mechanical Engineering & Mechanics Department at Drexel University. She received a B.S. in Engineering and B.A. in Economics from Swarthmore College in 1999 and a PhD in Mechanical Engineering from the University of Pennsylvania in 2007. Her current work focuses on developing a general control and coordination framework for distributed sensing and monitoring of dynamic and uncertain environments by mobile robot teams. She is a recipient of a 2012 Office of Naval Research (ONR) Young Investigator Award and a 2013 National Science Foundation (NSF) CAREER Award.
Views: 1438 cmurobotics
Cellular-Aided Inertial Navigation
 
02:15
This video presents the first experimental demonstration of an unmanned aerial vehicle's (UAV's) inertial navigation system (INS) being aided by a cellular signal of opportunity (SOP). While GPS is available, the UAV uses the INS aided by GPS and cellular signals to navigate while simultaneously mapping the cellular SOP. When GPS signals become unavailable, the UAV navigates exclusively with the INS aided by the cellular SOP while simultaneously mapping the cellular SOP (i.e., performing radio SLAM). Results demonstrate that the exploitation of free ambient cellular signals in the environment significantly reduces INS errors in the absence of GPS and bounds the INS drift. http://aspin.ucr.edu
Views: 471 ASPIN Laboratory
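The drift-bounding effect demonstrated above can be illustrated with a one-dimensional toy: a dead-reckoned position estimate accumulates error until a range measurement to a mapped cellular transmitter corrects it via a scalar Kalman update. The noise values and motion model below are illustrative assumptions, not the ASPIN system.

```python
import random

tower_x = 500.0      # mapped cellular SOP position (assumed known here)
x_est, P = 0.0, 1.0  # dead-reckoned position estimate and its variance
Q, R = 0.5, 4.0      # per-step drift noise / range noise (variances)
x_true = 0.0

for _ in range(100):
    x_true += 1.0                            # vehicle moves 1 m per step
    x_est += 1.0 + random.gauss(0.0, 0.05)   # propagation accumulates drift
    P += Q
    z = abs(tower_x - x_true) + random.gauss(0.0, R ** 0.5)  # cellular range
    h = abs(tower_x - x_est)                 # predicted range
    H = -1.0 if tower_x > x_est else 1.0     # d(range)/d(x_est)
    K = P * H / (H * P * H + R)              # Kalman gain
    x_est += K * (z - h)                     # measurement pulls estimate back
    P *= (1.0 - K * H)

print(f"final position error: {abs(x_est - x_true):.2f} m")
```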
Control of Mobile Robots- 2.4 Sensors
 
07:35
About the Course: This course investigates how to make mobile robots move in effective, safe, and predictable ways. The basic tool for achieving this is "control theory", which deals with the question of how dynamical systems, i.e., systems whose behaviors change over time, can be effectively influenced. In the course, these two domains - controls and robotics - will be interleaved and we will go from the basics of control theory, via robotic examples of increasing complexity, all the way to the research frontier. The course will focus on mobile robots as the target application, and problems that will be covered include (1) how to make (teams of) wheeled ground robots avoid collisions while reaching target locations, (2) how to make aerial, quadrotor robots follow paths in the presence of severe disturbances, and (3) how to achieve locomotion with bipedal, humanoid robots.
About the Instructor(s): Magnus Egerstedt is a Professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology, where he has been on the faculty since 2001. He is an award-winning teacher, with awards from both Georgia Tech and Harvard University. Dr. Egerstedt received the M.S. degree in Engineering Physics and the Ph.D. degree in Applied Mathematics from the Royal Institute of Technology, Stockholm, Sweden, and the B.A. degree in Philosophy from Stockholm University. Dr. Egerstedt's research interests include motion planning, control, and coordination of (teams of) mobile robots, and he is the director of the Georgia Robotics and Intelligent Systems Laboratory (GRITS Lab). Magnus Egerstedt is a Fellow of the IEEE and a recipient of the CAREER Award from the U.S. National Science Foundation.
Views: 6428 mouhknowsbest
