Videos uploaded by user “roboticsqut”
OpenRatSLAM: an open source brain-based SLAM system
 
02:01
OpenRatSLAM is an open source version of RatSLAM, developed at the Queensland University of Technology (QUT) in Brisbane, Australia: https://wiki.qut.edu.au/display/cyphy/RatSLAM For more information on how RatSLAM works, see: http://www.youtube.com/watch?v=t2w6kYzTbr8 Code and datasets are available from: http://code.google.com/p/ratslam/ D. Ball, S. Heath, J. Wiles, G. Wyeth, P. Corke and M. Milford, "OpenRatSLAM: an open source brain-based SLAM system", Autonomous Robots, 2013. http://link.springer.com/article/10.1007%2Fs10514-012-9317-9
Views: 5844 roboticsqut
AgBot Field Trials - Emerald Australia
 
02:57
Some footage of our agricultural robot platform taken during recent field trials on our industry partner's grain farm near Emerald Australia.
Views: 2988 roboticsqut
Vertical Infrastructure Inspection using a Quadcopter and Shared Autonomy Control.
 
02:55
This video shows a shared autonomy control scheme for a quadcopter that is suited to inspection of vertical infrastructure --- tall man-made structures such as streetlight or electricity poles, or the exterior surfaces of buildings. Current approaches to inspecting such structures are slow, expensive, and potentially hazardous.
Views: 1541 roboticsqut
LEGO Underwater Vehicle Test 1
 
05:24
This is a demonstration of an Underwater Vehicle using the LEGO Mindstorms NXT brick and other LEGO components.
Views: 313287 roboticsqut
ICRA2014 - Online Self-Supervised Multi-Instance Segmentation of Dynamic Objects
 
02:50
Video of results from our object segmentation method [1]. In contrast to many tracking-by-detection based methods, our system is able to detect dynamic objects without any prior knowledge of their visual appearance, shape, or location. Furthermore, the classifier is used to propagate labels of the same object from previous frames, facilitating the continuous tracking of individual objects based on motion. [1] A. Bewley, V. Guizilini, F. Ramos, and B. Upcroft, "Online Self-Supervised Multi-Instance Segmentation of Dynamic Objects", IEEE International Conference on Robotics and Automation (ICRA), 2014.
Views: 1140 roboticsqut
Profiling Float Test
 
01:17
Initial testing of a profiling float prototype. The vehicle is designed to mechanically alter its buoyancy to move vertically through the water column. Vehicle locomotion is provided by ambient currents in the environment. Initial testing has validated that the vehicle can operate continuously for one week, surfacing once every 24 hours to transmit data (temperature, salinity, and depth profile) and update the mission plan.
Views: 809 roboticsqut
Vision-Only Autonomous Navigation Using Topometric Maps
 
05:03
This demonstrates vision-only navigation in a challenging indoor environment. The system exploits the local accuracy of visual odometry to create local metric maps, and uses pose graph SLAM, visual appearance-based place recognition and point cloud registration to create the topometric map. Metric navigation is performed through neighboring local metric maps, directed at achieving a global topological goal. For the pose graph ROS software ('cyphy_vis_slam'), see the video at www.youtube.com/watch?v=Yjo_k3ExX9k and the stack at www.ros.org/wiki/cyphy_vis_slam
Views: 1694 roboticsqut
Learning Elastic Joint Robot Control
 
01:40
Video of an experiment for learning to control a three-degree-of-freedom series elastic robot arm. The system incrementally learns a model of itself and uses this to track a figure-of-eight trajectory. Results presented at the Australasian Conference on Robotics and Automation, 2012.
Views: 218 roboticsqut
Visual Slam with the CyPhy Lab's VSLAM ROS Stack
 
02:12
The cyphy_vis_slam ROS stack in action over a ~100m indoor loop. This video provides a brief overview of the components of the stack before showing how the software performs in a challenging indoor environment.
Views: 2822 roboticsqut
Robotronica: Guiabot navigation in a crowded space
 
03:13
Metric navigation using Kinect point clouds and a laser range finder (LRF). We found that the Kinect provided valuable information on obstacles that have a smaller footprint than their vertically projected exterior (e.g., chairs), while the LRF is required to detect short objects that fall below the vertical field of view of the Kinect (e.g., children).
Views: 471 roboticsqut
Crown-Of-Thorns Starfish Robot (COTSbot)
 
01:31
https://wiki.qut.edu.au/display/cyphy/COTSBot Crown-of-Thorns Starfish (Acanthaster planci) are described as one of the most significant threats to the Great Barrier Reef. Since the 1960s, land-based nutrient runoff has accelerated outbreaks of COTS, which are destroying large areas of reef. With few natural predators, traditional control of COTS required manually injecting each starfish more than 10 times with a biological agent. In 2014, a new agent developed by James Cook University (JCU) was released, requiring only one injection per starfish. This advancement provided the stimulus for us to revisit automated (robotic) COTS population control and monitoring. COTSbot is intended as a proof-of-concept robotic system that consolidates prior and ongoing research into image-based COTS detection, robotic vision, manipulator control, and shallow-water Autonomous Underwater Vehicle (AUV) design, navigation and control, to directly facilitate COTS reduction. This multi-stage project will validate and demonstrate AUV performance to stakeholders and ensure the system components are a useful and flexible enabling foundation technology for environmental monitoring beyond the problem of COTS control.
Views: 8570 roboticsqut
Mechatronics Team Project 1 - ENB229 Line Following Robot Demo
 
00:26
Mechatronics Team Project 1 (ENB229) is a subject within the Mechatronics course at Queensland University of Technology. In 2012, students (in teams of four) built a line-following robot from scratch on a budget of $90. Students were required to design the whole platform, including the laser-cut base, design and populate the PCB, and write all the software. ENB229 is taught by Ben Upcroft. Video by Ben Upcroft, footage shot by Chris Lehnert. Music by www.pacdv.com/sounds/
Views: 1371 roboticsqut
Robotic Vision Summer School
 
02:38
Kioloa, Australia, 12-17 March 2017 http://roboticvision.org/events/rvss-summer-school
Views: 68 roboticsqut
A novel energy efficient controllable stiffness joint
 
00:31
A new joint design for energy-efficient legged locomotion, critical for the future of robot mobility. The hopping frequency can be changed by changing the position of the spring. Full details are in the following research paper: D. Ball, P. Ross, J. Wall and R. Chow, "A novel energy efficient controllable stiffness joint", International Conference on Robotics and Automation (ICRA), 2013.
Views: 571 roboticsqut
Building 3D Reconstruction from Airborne Visual Odometry
 
00:23
High-detail 3D reconstruction of trees and an aircraft hangar from low-altitude visual data. A downward-facing camera recording at 30 Hz is mounted on a fixed-wing UAV flown at an altitude of approximately 200 ft over the flight area. For more information on the method used to generate this reconstruction, please see the paper titled "Large Scale Monocular Vision-only Mapping from a Fixed-Wing sUAS" http://eprints.qut.edu.au/55403/1/55403A.pdf The dataset used to generate this reconstruction is available for public download at https://wiki.qut.edu.au/display/cyphy/Kagaru+Airborne+Dataset
Views: 836 roboticsqut
Matching 1973 and 2002 traverses of Hollywood Boulevard filmed by Ruscha
 
01:06
Just for fun, using a new and improved single-frame version of SeqSLAM (now even more of a misnomer), we matched the 1973 and 2002 footage shot by Ruscha of Hollywood Boulevard - http://youtu.be/6KIvGMVhaPs. All matching is performed using single (no sequences) 64x16-pixel grayscale images. Some minor image cropping was performed to crudely compensate for differences in field of view. SeqSLAM references: M. Milford and G. Wyeth, "SeqSLAM: Visual route-based navigation for sunny summer days and stormy winter nights", IEEE International Conference on Robotics and Automation, 2012. M. Milford, "Visual Route Recognition with a Handful of Bits", Robotics: Science and Systems, 2012.
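As a rough sketch of the single-frame matching idea described above (not the authors' code; the function names, block-average downsampling, and sum-of-absolute-differences comparison are assumptions based on the SeqSLAM papers), matching tiny 64x16 grayscale thumbnails could look like this:

```python
import numpy as np

def preprocess(frame, shape=(16, 64)):
    """Downsample a grayscale frame to a tiny thumbnail (16 rows x 64
    columns, matching the 64x16-pixel images in the description) and
    normalize to zero mean, unit variance for some lighting invariance."""
    rows, cols = shape
    h, w = frame.shape
    # block-average downsampling; crops any remainder pixels first
    frame = frame[:h - h % rows, :w - w % cols]
    thumb = frame.reshape(rows, h // rows, cols, w // cols).mean(axis=(1, 3))
    return (thumb - thumb.mean()) / (thumb.std() + 1e-9)

def best_match(query, reference_frames):
    """Index of the reference frame with the smallest
    sum-of-absolute-differences (SAD) to the query thumbnail."""
    q = preprocess(query)
    sads = [np.abs(q - preprocess(r)).sum() for r in reference_frames]
    return int(np.argmin(sads))
```

Each 2002 frame would be matched independently against every 1973 frame this way, with no sequence information used.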
Views: 331 roboticsqut
Coral Reef Reconstruction Close-up
 
00:15
A very high resolution 3D mesh of a section of coral reef generated from visual odometry. Dataset credit to the Australian Centre for Field Robotics, University of Sydney.
Views: 314 roboticsqut
teamQUT RobotX 2014 teaser
 
01:03
Queensland University of Technology's 2014 Maritime RobotX Challenge teaser video
Views: 2715 roboticsqut
Scansorial (Climbing) Robotic Platform
 
01:31
A prototype robotic platform suitable for climbing vertical and very steep surfaces. It uses a tri-pedal gait incorporating individually compliant micro-spined toes. The platform is the culmination of a 10-week Vacation Research Experience (VRES) undertaken at QUT by an undergraduate Mechatronics Engineering student.
Views: 239 roboticsqut
Farmland 3D Reconstruction from Airborne Visual Odometry
 
00:24
High-detail 3D reconstruction of farmland from low-altitude visual data. A downward-facing camera recording at 30 Hz is mounted on a fixed-wing UAV flown at an altitude of approximately 200 ft over some recently back-burned farmland. For more information on the method used to generate this reconstruction, please see the paper titled "Large Scale Monocular Vision-only Mapping from a Fixed-Wing sUAS" http://eprints.qut.edu.au/55403/1/55403A.pdf The dataset used to generate this reconstruction is available for public download at https://wiki.qut.edu.au/display/cyphy/Kagaru+Airborne+Dataset
Views: 400 roboticsqut
4 NAOs Dancing demonstration
 
01:33
A dancing demonstration with four NAO robots in the QUT CyPhy Lab.
Views: 332 roboticsqut
teamQUT RobotX Field Test 23April2014
 
00:50
TeamQUT field test video from the 23 April 2014. TeamQUT is the Queensland University of Technology's entrant in the 2014 Maritime RobotX Challenge.
Views: 2803 roboticsqut
3D Tracking of Water Hazards with Polarized Stereo Cameras
 
01:06
This research uses a novel stereo-polarization system to detect and track water hazards based on both the polarization and color variation of reflected surfaces, evaluated on challenging new wet-weather road datasets. It is presented in "3D tracking of water hazards with polarized stereo cameras" at ICRA 2017. This work was led by Chuong Nguyen at the Australian National University in collaboration with QUT, as part of the Australian Centre for Robotic Vision (roboticvision.org). The preprint is available here: https://arxiv.org/abs/1701.04175
Views: 124 roboticsqut
Towards Unsupervised Weed Scouting for Agricultural Robotics
 
02:38
[pdf] https://arxiv.org/abs/1702.01247 Towards Unsupervised Weed Scouting for Agricultural Robotics. David Hall, Feras Dayoub, Jason Kulk, Chris McCool. IEEE International Conference on Robotics and Automation (ICRA), 2017.
Views: 139 roboticsqut
Kagaru Airborne Vision Dataset
 
01:10
This dataset is available for public download at https://wiki.qut.edu.au/display/cyphy/Kagaru+Airborne+Dataset A stereo vision dataset captured using a synchronized stereo camera pair mounted in the fuselage of a radio-controlled aircraft. Stereo baseline: 750 mm. Image resolution: 1280x960. Framerate: 30 Hz. XSens INS/GPS data captured simultaneously for ground truth. Framerate and resolution are reduced for this video.
Views: 587 roboticsqut
RGBD-Enabled People Following Robot
 
00:48
The ROS people_tracker package in action.
Views: 1776 roboticsqut
FSR 2013 Video Demo
 
03:03
Views: 134 roboticsqut
Making People-Centric Maps
 
03:11
This video shows the process of making people-centric maps from a series of trajectories gathered onboard the robot while following people with an RGB-D sensor. Those trajectories are then clustered in a probabilistic cluster tree, and samples are drawn from the cluster tree and fed to a map creation process that is similar to occupancy grid building.
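As a loose sketch of the final accumulation step described above (hypothetical code, not the authors' implementation; the probabilistic cluster tree is omitted and raw (x, y) trajectory samples are histogrammed directly):

```python
import numpy as np

def trajectory_density_map(trajectories, resolution=0.1, size=(100, 100)):
    """Accumulate (x, y) samples from person-following trajectories into
    a 2D grid normalized to [0, 1], in the spirit of occupancy grid
    building. `trajectories` is a list of (N, 2) arrays of positions in
    metres; `resolution` is metres per grid cell."""
    grid = np.zeros(size)
    for traj in trajectories:
        for x, y in traj:
            i, j = int(x / resolution), int(y / resolution)
            if 0 <= i < size[0] and 0 <= j < size[1]:
                grid[i, j] += 1  # count visits to this cell
    peak = grid.max()
    return grid / peak if peak > 0 else grid
```

High-valued cells then correspond to areas people actually traverse, rather than merely unoccupied space.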
Views: 732 roboticsqut
ICRA2014 Video Demo
 
01:01
Views: 152 roboticsqut
AIM13 0133 VI i
 
01:42
Views: 81 roboticsqut
MikroKopter Autonomous Hovering and Waypoints Following using ROS
 
03:22
MikroKopter Autonomous Hovering and Waypoints Following using ROS
Views: 388 roboticsqut
QUT Cyphy Lab. Autonomous GuiaBot docking demonstration
 
03:08
QUT Cyphy Lab. Autonomous GuiaBot docking demonstration
Views: 226 roboticsqut
