Universal Robotics

Universal Robotics, Inc.
Industry: Automation; Machine Vision; Software; Artificial Intelligence; Big Data; Cybernetics
Founded: 2001 (commenced operations 2008)
Headquarters: Nashville, Tennessee
Key people: David A. Peters, CEO; Hob Wubbena, VP; Richard Alan Peters II, Ph.D., CTO
Products: Neocortex, Spatial Vision, Unlimited Depalletization, 3D Inspection, engineering services
Website: http://www.universalrobotics.com

Universal Robotics, Inc. is a software engineering and robotics company headquartered in Nashville, Tennessee.[1][2] The company combines artificial intelligence with multi-dimensional sensing and motion control to help companies automate processes, from making machines more flexible to analyzing big data.[3]

The company commenced operations in 2008 and specializes in automating complex processes that have not previously been automated.[4] Universal’s flagship intelligence software, Neocortex, enables robots to perform tasks that are too costly, dangerous, or impossible for humans to undertake. The technology was funded by DARPA and NASA, was originally co-developed through a seven-year[5] partnership between NASA and Vanderbilt University,[4] and is employed in NASA’s Robonaut.[3]

By combining the Neocortex intelligence platform with modular sensing and control software products, Universal Robotics provides flexible applications for materials handling. Its software products include 3D machine vision products[6] (Spatial Vision, Spatial Vision Robotics, and Spatial Vision Inspection) and automated robot programming (Autonomy). Applications[7] include Unlimited Box Moving, Unlimited Depalletization, Random Bin Picking, Random Bag Picking, and 3D Inspection.

Products and services

Universal Robotics offers three modular software product families: Neocortex® provides real-time intelligence; Spatial Vision® performs multi-dimensional sensing (Spatial Vision Robotics for 3D vision guidance, Spatial Vision Inspection for 3D visual inspection); and Autonomy provides automated, enhanced control of robots and machines.[8]

Neocortex

Neocortex is a form of Artificial Intelligence (AI) that allows mobile machines, such as robots, to learn from their experiences in the physical world rather than being programmed to act.[9]

Traditional AI gives robots pre-programmed actions for anticipated variables, so they fail whenever an unprogrammed variable is encountered. Neocortex is instead based on the pattern of learning found throughout nature.[10] The patent-protected software allows a machine to develop its own understanding by sensing and acting in the physical world, using information from up to 70 channels of sensor data.[4] NVIDIA GPUs are used to speed up processing.[11]

With Neocortex, machines learn from their experiences, enabling robots to perform nearly any task that requires adaptive human input. Neocortex allows a machine to determine its actions by remembering what worked and what failed in past attempts. It responds dynamically to change with real-time sensory input and uses memory to match what is known with what it is learning.[12] Its database grows over time, allowing for adaptation; with enough experience, Neocortex can enable a machine to draw correlations and attempt an entirely new solution to a given task.[13]
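
As a rough illustration of this memory-based approach (a sketch only; Universal's actual implementation is not public), the example below stores past sensor readings together with the action taken and whether it succeeded, then chooses the action that succeeded most often in the most similar remembered situations:

    import numpy as np

    class ExperienceMemory:
        """Toy memory-based action selection: remember (features, action,
        succeeded) triples and prefer the action that worked best in the
        most similar past situations."""

        def __init__(self):
            self.features = []   # 1-D sensor feature vectors
            self.actions = []    # action label taken in that situation
            self.outcomes = []   # 1.0 if the attempt worked, 0.0 if it failed

        def record(self, features, action, succeeded):
            self.features.append(np.asarray(features, dtype=float))
            self.actions.append(action)
            self.outcomes.append(1.0 if succeeded else 0.0)

        def choose_action(self, features, candidates, k=5):
            """Score each candidate action by the success rate of the k most
            similar remembered situations in which it was tried."""
            if not self.features:
                return candidates[0]          # no experience yet: pick arbitrarily
            x = np.asarray(features, dtype=float)
            dists = np.linalg.norm(np.stack(self.features) - x, axis=1)
            best_action, best_score = candidates[0], -1.0
            for action in candidates:
                idx = [i for i, a in enumerate(self.actions) if a == action]
                if not idx:
                    continue
                nearest = sorted(idx, key=lambda i: dists[i])[:k]
                score = float(np.mean([self.outcomes[i] for i in nearest]))
                if score > best_score:
                    best_action, best_score = action, score
            return best_action

As the memory grows, the same query begins to favor whatever worked in increasingly similar situations, which is the "compounding database" behavior the paragraph describes.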

Neocortex technology was developed at Vanderbilt University and NASA, where it was used as the “brain” of Robonaut.[14][15] Today, Neocortex enables machines to perform highly specialized, automated tasks that require them to react and adapt to their environments.

For example, in the materials handling industry, Neocortex helps machines adapt to mixed-size boxes for palletizing and de-palletizing. Traditional AI has not typically worked in this area because of the difficulty in programming for every circumstance.[4] Since Neocortex helps machines learn, it can adapt to a highly variable palletization process.[16]

3D vision software: Spatial Vision

Universal’s Spatial Vision line of products was created during the development of Neocortex.[9] The products provide 3D vision and include Spatial Vision, Spatial Vision Robotics, and Spatial Vision Inspection.[17]

3D vision systems have many benefits over 2D systems, including better accuracy and object identification. Although advances in technology have reduced expense and complexity, adoption has been slow because of long-standing perceptions that such systems are costly and difficult to maintain.[18]

Spatial Vision software combines the images from two off-the-shelf USB webcams to determine a point’s 3D coordinates.[19] This 3D data can be used to measure, identify objects, and calibrate to help guide robots with higher accuracy than 2D systems. Spatial Vision provides sub-pixel accuracy of 0.1 pixel[20] but does not require precision mounting or specialized cameras, which makes it easy to set up[19] at a fraction of the cost of traditional 3D vision systems.[9]
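
The underlying geometry is standard stereo triangulation: once each camera has been calibrated to a 3x4 projection matrix, a point matched in both images can be recovered by linear least squares. The sketch below is a generic illustration of that step, not Spatial Vision's code; the projection matrices and pixel coordinates are placeholders supplied by the caller.

    import numpy as np

    def triangulate(P1, P2, uv1, uv2):
        """Direct linear transform (DLT) triangulation.
        P1, P2  : 3x4 camera projection matrices from calibration
        uv1, uv2: matching pixel coordinates (u, v) of the same point in each image
        Returns the estimated 3-D point in the calibration's world frame."""
        u1, v1 = uv1
        u2, v2 = uv2
        A = np.array([
            u1 * P1[2] - P1[0],
            v1 * P1[2] - P1[1],
            u2 * P2[2] - P2[0],
            v2 * P2[2] - P2[1],
        ])
        # The homogeneous 3-D point is the right null vector of A
        # (the singular vector with the smallest singular value).
        _, _, vt = np.linalg.svd(A)
        X = vt[-1]
        return X[:3] / X[3]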

The product provides programmatic interfaces to 3D calibration files for custom C, C++ and MATLAB applications.[20] It supports GigE Allied Vision cameras and accuracy tools, such as snap-to-corner measurement assistance and accuracy calculator displays.[21]
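
An accuracy display of the kind mentioned above could, for example, report reprojection error: the pixel distance between a measured image point and the re-projection of its triangulated 3D estimate. The helper below is a hypothetical illustration of that calculation, not the product's actual API.

    import numpy as np

    def reprojection_error(P, X, uv):
        """Project the triangulated 3-D point X back through projection
        matrix P and return its pixel distance from the measured point uv."""
        Xh = np.append(X, 1.0)            # homogeneous coordinates
        proj = P @ Xh
        uv_est = proj[:2] / proj[2]       # perspective divide
        return float(np.linalg.norm(uv_est - np.asarray(uv, dtype=float)))

Averaging this error over points with known positions gives a simple accuracy figure for a given camera placement.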

[Image: Motoman robot in Universal Robotics' lab (File:UR Lab SDA10.jpg).]

Spatial Vision can be used for tasks ranging from engineering applications to motion capture to improved facial recognition.[11] The system can also be used to measure in-store foot traffic patterns, as well as in scientific applications requiring object tracking and visual analytics.[18]

It can be deployed in any setting where a pair of cameras can be installed, including manufacturing lines, warehouses, laboratories, office buildings and department stores.[4]

Universal also partnered with Logitech to launch the Spatial Vision Logitech Bundle, which includes the Spatial Vision software, two webcams and a pair of 3D glasses.[1]

Spatial Vision Robotics provides 3D vision guidance software specially engineered to guide robots. Using Spatial Vision’s 3D data, it tracks the moving machinery it controls relative to its surroundings and objects of interest.[18] The software offers real-time vision guidance for random parts picking, pallet sorting, automated kitting and box moving (palletization and depalletization).[22]

As part of an ongoing collaboration with Motoman Robotics, a division of Yaskawa America, Inc., Universal launched MotoSight 3D Spatial Vision, a 3D vision system for cost-effective, flexible and scalable real-time guidance of Motoman robots.[1][9] The system determines an object's position and pose in six degrees of freedom (X, Y, Z, Rx, Ry, Rz) and is accurate within 2–4 mm with off-the-shelf Logitech 9000 webcams. Accuracy can be improved by substituting GigE cameras for the webcams.[23]
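
Downstream code typically consumes such a six-value pose as a homogeneous transform. The conversion below is a generic sketch; the angle units and rotation order are assumptions for illustration, since the system's actual convention is not stated here.

    import numpy as np

    def pose_to_matrix(x, y, z, rx, ry, rz):
        """Build a 4x4 homogeneous transform from a 6-DOF pose.
        Angles are in radians; rotation order Rz @ Ry @ Rx (fixed-axis XYZ)
        is assumed for illustration -- conventions vary between vendors."""
        cx, sx = np.cos(rx), np.sin(rx)
        cy, sy = np.cos(ry), np.sin(ry)
        cz, sz = np.cos(rz), np.sin(rz)
        Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
        Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
        Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
        T = np.eye(4)
        T[:3, :3] = Rz @ Ry @ Rx   # rotation part
        T[:3, 3] = [x, y, z]       # translation part
        return T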

Spatial Vision Inspection provides 3D inspection of objects up to the size of a pallet. It improves quality by reducing the variability of visual inspection while inspecting at production speeds. In the case of pallets, it identifies a wide range of defects such as raised nails, damaged wood, split or loose boards, or missing wood.[24]

Spatial Vision Inspection creates an accurate 3D image of the object through the proper placement of a combination of cameras and other sensors. This enables clarity even at the edges of the field of view, where most of the damage to pallets occurs.[25]

In 2011, Spatial Vision Inspection became the first industrial application to use the Microsoft Kinect structured light sensor for real-time 3D. In 2012, it became the first industrial application to use four Microsoft Kinect structured light sensors simultaneously.[26]

Both Spatial Vision Robotics and Spatial Vision Inspection use a variety of sensors and cameras for multi-dimensional sensing, such as structured light sensors, camera pairs (stereopsis), webcams, lasers, Light Detection and Ranging (LIDAR), and time-of-flight sensors.[12]

Automated robot programming: Autonomy

Universal’s Autonomy is motion control software that automates robot programming,[14] moving a robot at high speed using a variety of sensor inputs. It provides real-time autonomous reaction for robots as well as motion planning and collision avoidance. Autonomy couples Spatial Vision Robotics with robot kinematics to allow a robot to react dynamically to changes in object positioning. For example, the software allows a paint-spraying robot to maintain a consistent distance from an assembly-line part swaying on a moving cable, which reduces over-spray.
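
The paint-spraying example amounts to closed-loop correction from a sensed distance. A minimal sketch of that idea, assuming a target standoff distance and a simple proportional gain (both hypothetical values, not Autonomy's actual control law), might look like this:

    def standoff_correction(measured_distance_mm, target_distance_mm=150.0,
                            gain=0.5, max_step_mm=10.0):
        """Return a tool displacement along the spray axis that moves the
        nozzle toward the target standoff, clamped to a safe per-cycle step."""
        error = measured_distance_mm - target_distance_mm
        step = gain * error
        return max(-max_step_mm, min(max_step_mm, step))

    # Each control cycle: read the part's sensed distance (e.g., from 3D vision),
    # compute a correction, and command the robot to shift the nozzle by that amount.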

Applications

Universal Robotics’ applications include Unlimited Box Moving and Unlimited Depalletization, Random Bin Picking, Random Bag Picking, and 3D Inspection.[7]

Unlimited Box Moving & Unlimited Depalletization application

Universal Robotics recently introduced on-the-fly handling of boxes of any size, a new area of flexible automation for logistics. The application moves up to 1,400 cases per hour of cartons it has never seen before, in any orientation and with any combination of labels.[27] It works independently of box variability, including size and condition, weight, location and orientation within the work cell, label quantity and type, and box graphics or color.[28]

The application can move an unlimited number of boxes, palletizing or depalletizing, whether they are on the floor, on a moving conveyor, or in a container. It unloads partial, mixed, or full pallets of loosely or tightly packed boxes, regardless of the number of layers, and handles single, double and triple picks on the fly. It can move boxes from floor to conveyor, table to pallet, pallet to conveyor, truck to conveyor, or assembly line to staging area.

Its flexibility enables picking and placing a wide range of box sizes, from 6 inches up to 48 inches. The application provides depalletization of pallets up to 48 inches by 48 inches and 60 inches high. Cases can be stacked in any orientation and order, and layers can range from fully packed to partially filled and from homogeneous to mixed. Boxes can be plain or carry a variety of shipping labels, tape, lettering, or graphic designs. Boxes may contain frozen goods or require special handling for fragile contents.

Random Bin Picking application

The Random Bin Picking application enables a robot to automatically move a number of randomly placed parts at typical speeds, regardless of their orientation or how deep the stack is. It uses a suite of sensors that integrates off-the-shelf structured light sensors and pairs of cameras for stereoscopic vision.[29]

The standard application moves one part in any orientation at up to 30 parts per minute with standard motion control. Whether loosely or tightly packed, on the floor, a conveyor or in a container, the parts can be in any orientation. It also provides 3D guidance to the robot regardless of the presence or type of labels or the material type, and whether the parts are individually placed on a flat surface or packed tightly and randomly in a bin up to 48 inches deep. This cost-effective approach eliminates expensive fixturing and automated tables and works well under varying light conditions. Optionally, Universal’s Random Bin Picking can handle up to three parts in any orientation with any combined mix of SKUs per layer. Optional high-speed sensor servoing can further increase throughput where required.

Random Bag Picking application

The Random Bag Picking application enables a robot to automatically move a number of randomly placed bags at typical speeds, regardless of their orientation or how many layers are ‘piled’ together. It uses a cost-effective suite of sensors that integrate off-the-shelf structured light sensors and pairs of cameras for stereoscopic vision.[30]

The standard application moves one bag in any orientation at up to 12 bags per minute with standard motion control. Whether loosely or tightly packed, on the floor, a conveyor or in a container, the bags can be in any orientation. The application dynamically provides 3D guidance to the robot regardless of labels or material type. Bags can be stacked in an unlimited number of layers up to 60 inches high. This cost-effective approach eliminates expensive fixturing and automated tables and works well under varying light conditions. Optionally, the Random Bag Picker can handle up to three bags in any orientation with any combined mix of SKUs per layer. Optional high-speed sensor servoing can further increase throughput where required.

3D inspection application

Traditionally, pallet inspection has been done visually, with varying levels of manual handling or automated machinery requiring the pallet to be lifted and flipped to see all surfaces. Stringent and frequent audits are required to reduce the variability of visual inspection, and traditional 2D vision does not offer reliable inspection.

The 3D inspection application[31] replicates manual inspection with an automated 3D vision system that flexibly adapts to a pallet's structure. It quickly identifies a wide range of defects, including raised nails and wood damage (split, loose, or missing boards), at production line speed. CHEP, the global leader in pallet and container pooling services, uses this application for automated 3D pallet inspection worldwide.[24]

Engineering

Universal’s engineering team has expertise in technologies related to sensing, manipulation, and artificial intelligence, and the company offers customers engineering services in these areas.

Leadership

Universal Robotics was founded and is led by David Peters, CEO,[33] and his brother,[14] Dr. Alan Peters, CTO.[34] Dr. Peters is the principal architect of Neocortex and an Associate Professor at Vanderbilt University in Nashville, Tennessee.[35] Hob Wubbena is the company’s Vice President of Strategic Planning and Marketing.[36][37]

References

  1. 1.0 1.1 1.2 (citation details not recoverable)
  2. Hogan, Hank (November/December 2010). "Universal Robotics' 3D Vision." Robotics Business Review: 23–25.
  3. 3.0 3.1 "Universal Robotics Introduces Neocortex, 'Software with an IQ'." Yahoo! Finance. Retrieved 8 June 2011.
  4. 4.0 4.1 4.2 4.3 4.4 (citation details not recoverable)
  5. http://www.robotics.org/content-detail.cfm/Industrial-Robotics-Featured-Articles/Robotics-Vision-at-a-Glance-The-Dos-Don-ts-and-Applications/content_id/4189
  6. http://www.universalrobotics.com/products
  7. 7.0 7.1 http://www.universalrobotics.com/urapplications
  8. Pfeifer, Rolf (1999). "Understanding Intelligence." The MIT Press.
  9. 9.0 9.1 9.2 9.3 (citation details not recoverable)
  10. Pfeifer, Rolf (1999). "Understanding Intelligence." The MIT Press.
  11. 11.0 11.1 (citation details not recoverable)
  12. 12.0 12.1 http://www.robotics.org/userassets/riauploads/file/TH_RIA_David_Peters.pdf
  13. http://www.universalrobotics.com/neocortex
  14. 14.0 14.1 14.2 (citation details not recoverable)
  15. Ambrose, Rob (July 2000). "Robonaut: NASA's space humanoid." IEEE Intelligent Systems 15 (4): 57–63.
  16. "iRobot: A Robot that Learns." Nashville Technology Council: Catalyst. 2010.
  17. (citation details not recoverable)
  18. 18.0 18.1 18.2 (citation details not recoverable)
  19. 19.0 19.1 (citation details not recoverable)
  20. 20.0 20.1 (citation details not recoverable)
  21. (citation details not recoverable)
  22. (citation details not recoverable)
  23. (citation details not recoverable)
  24. 24.0 24.1 (citation details not recoverable)
  25. (citation details not recoverable)
  26. http://www.universalrobotics.com/urcompany
  27. http://www.universalrobotics.com/random-box-mover
  28. http://www.prweb.com/releases/neocortex-box-moving/unlimited-depalletizing/prweb10899645.htm
  29. http://www.universalrobotics.com/random-bin-picking
  30. http://www.universalrobotics.com/random-bag-picking
  31. http://www.universalrobotics.com/3D-inspection
  32. (citation details not recoverable)
  33. (citation details not recoverable)
  34. (citation details not recoverable)
  35. (citation details not recoverable)
  36. (citation details not recoverable)
  37. (citation details not recoverable)
