Robots Used in Production Lines

by Gayle Ehrenman, Associate Editor

Industrial robots have evolved considerably since Joseph Engelberger and George Devol’s Unimate was installed at a General Motors plant to extract molded parts from die-casting machines in the late 1950s. Where those early-generation robots were generally highly specialized, single-purpose machines, modern robots have evolved to the point where one can be used for a variety of tasks, with just a change of the tools and the programming. Their precision and accuracy have improved by leaps and bounds, as has the software that powers them, making these automation tools capable of handling tasks as varied as grasping and placing an auto windshield or selecting only perfectly formed cookies for packaging.

The automotive industry was one of the earliest to adopt industrial robots, and it continues to be one of their biggest users, but now industrial robots are turning up in more unusual factory settings, including pharmaceutical production and packaging, consumer electronics assembly, machine tooling, and food packaging.

Part of what’s driving that growth is the addition of machine vision systems to the robots. These systems, which use a camera and software, enable the robots to adapt to changing conditions and variability in the production environment.

“It’s very expensive for a company to have to set up a factory line so that parts are presented in the exact same position to a robot over time,” said Dion Spurr, engineering manager for the robot automation group of ABB U.S. in Norwalk, Conn. ABB sells vision-enabled robot systems. “A non-vision-enabled robot will crash or miss if it goes for a part and comes up empty,” Spurr said. This can result in a costly and time-consuming line shutdown or manual intervention.

The robotics industry overall is growing rapidly. A total of 5,316 robots valued at $302.5 million were ordered by North American manufacturing companies from January to March of this year, according to the Robotic Industries Association in Ann Arbor, Mich. An additional 272 robots valued at $18 million were ordered by manufacturing companies outside of North America.

Orders for materials-handling robots, the largest application, posted a 67 percent gain over the first quarter 2004 results. Orders for arc welding applications jumped 76 percent, and for coating/dispensing robots they rose 49 percent, according to RIA’s survey.

The association estimates that approximately 147,000 robots are currently in use in factories in the United States.

Worldwide, however, Japan leads the robot charge. In 2002, the worldwide operational stock of multipurpose industrial robots was estimated at 769,900 robots, of which 350,170 were in use in Japan; 103,500 were in the United States; 233,000 were in the European Union; and 11,640 were located in Asia (excluding Japan) and Australia, according to the International Federation of Robotics in Paris. IFR together with the United Nations Economic Commission for Europe regularly surveys the robotics market.

IFR estimates that the worldwide stock of industrial robots will increase to 875,000 units by the end of 2006, representing an average annual growth of 4.5 percent.

No current market research is available that breaks down vision-enabled versus blind robot usage. However, all the major industrial robot manufacturers are turning out models that are vision-enabled; one manufacturer said that its entire current line of robots is vision-enabled.

The AdeptViper s650 is a six-axis articulated robot with a 653 mm reach and an integrated vision system.

Where blind robots consist of just a robot and a computer-driven controller, vision-enabled robots add a camera for capturing images of components as they travel down the conveyor belt, software for identifying and processing those images, and lighting to make sure the camera captures the best possible image. The cameras can be placed either directly on the robot or above the work cell. The software runs on a Windows-based computer connected to an Ethernet network.

Instead of locking a part’s position in advance to suit a blind robot, the camera in a vision-enabled robot lets the robot see the position of a loose part and adjust itself accordingly to manipulate it. The system calculates a component’s position on the fly, grasps it, inspects it, and moves it to where it needs to go.

ABB U.S. partnered with Braintech Inc. in North Vancouver, British Columbia, to create a commercial out-of-the-box vision-enabled robotic system, called TrueView. It includes an ABB robot and controller, Braintech’s eVisionFactory software, a standard charge-coupled device (CCD) camera, a Windows-based computer, an integrated lighting system to ensure that the camera can capture clear images, an end-effector (or robot hand), conveyors, and enclosures for the system.

The ABB/Braintech solution is aimed at the automotive industry, but covers a range of applications, including materials handling, spot welding, machining, applying adhesives or sealant, spray painting, automated assembly, and inspection or identification of parts.

“We can handle any rigid parts,” said ABB’s Spurr. “Parts that are made out of metal, like engine blocks and transmissions, have features that repeat, so they’re perfect for the vision-enabled robots to detect and handle. A bad candidate for our system would be a part made out of a soft material, like rubber,” he added. “Or a substance that shrinks so it’s hard for the vision system to detect consistent features.”

The system integrates a single camera into the robot’s end-effector, and adds a specialized, durable lighting unit that moves with the robot to illuminate the parts. Cabling to connect the camera and robot to the controller runs up through the arm. The system is marketed as “a robot in a box,” and sells for less than $100,000 for the robot and vision system, according to Spurr.

The camera in a vision-enabled robot lets the robot see the position of a loose part traveling along a conveyor and then adjust itself to manipulate it.

The camera captures an image of a part as it moves along the conveyor, and transmits that image via an Ethernet network to the Windows-based PC. The eVisionFactory software on the PC analyzes the image to find identifiable features in the part. There are roughly 15 to 20 identifiable features in the average door panel, according to Jim Dara, vice president of sales and general manager for Braintech. The software uses that information to calculate where the part sits in 3-D space (it defines the x, y, z position, and roll, pitch, and yaw angles) and transmits that coordinate data back to the robot, so the robot hand can intercept each part correctly for grasping or performing other processes.
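Dara’s description amounts to a pose-estimation step: match detected features against their known locations on the part model, then solve for the rigid transform that explains the match, which yields exactly the six numbers he mentions (x, y, z, roll, pitch, yaw). Here is a minimal sketch in Python/NumPy, assuming idealized, already-matched 3-D feature points; eVisionFactory’s actual pipeline works from camera images and is certainly more involved, and all names below are invented for illustration:

```python
import numpy as np

def estimate_pose(model_pts, observed_pts):
    """Fit the rigid transform (rotation R, translation t) that best maps
    known feature locations on the part model onto their observed
    positions -- a standard Kabsch/SVD fit, standing in for what a
    commercial vision package computes from detected part features."""
    model_pts = np.asarray(model_pts, float)
    observed_pts = np.asarray(observed_pts, float)
    cm, co = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (model_pts - cm).T @ (observed_pts - co)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = co - R @ cm          # observed ~ R @ model + t
    return R, t

def roll_pitch_yaw(R):
    """Extract roll, pitch, yaw (x-y-z convention) from a rotation matrix."""
    return (np.arctan2(R[2, 1], R[2, 2]),
            np.arcsin(-R[2, 0]),
            np.arctan2(R[1, 0], R[0, 0]))
```

Given the fitted transform, the controller only needs `t` and the three angles to redirect the robot hand to the shifted part.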

While the process sounds incredibly complex, it moves quickly enough to cause no slowdown or stopping of the production process. “The whole process from image acquisition to feature detection to coordinate transmission back to the robot takes just one or two seconds,” Dara said. He claimed that the process is generally accurate within a tenth of a millimeter.

The TrueView system can link up to five robots under one controller. It can see the differences among several different parts and perform multiple actions with any of the parts as circumstances require.

At Ford’s Windsor, Ontario, plant, where V8 and V10 engines are produced for the Ford F-150 pickup truck and other vehicles, six ABB vision-enabled robot systems were installed during a holiday break. Four systems are dedicated to robots handling cylinder heads that are destined for machining, an early stage in the manufacturing process. The robot cameras and software direct the robots to unload the 40-pound parts from pallets (called “dunnage”) and place them on brackets fixed to a conveyor. They are then transported to work areas for precision machining.

The challenge is that the cylinder heads arrive four to a pallet, in a variety of positions from having shifted in transit. Lacking vision guidance, a non-human-controlled gantry arm can crash into the heads, damage them, and cause the line to shut down, costing an automaker up to thousands of dollars per minute in lost production. With vision-enabled robotics, not only is this problem solved, but additional savings can be realized by changing from the expensive “exact fixturing” dunnage required by blind systems to less expensive plastic blow-mold dunnage. The potential savings per cell run to hundreds of thousands of dollars annually.

Where ABB’s system is geared toward handling large components, industrial robot manufacturer Adept Technology Inc. of Livermore, Calif., markets its vision-enabled robots for small parts assembly and materials handling applications, such as pick and place, and parts transfer.

“We specialize in handling parts of five pounds or less, in settings that call for speed and precision, such as assembling disc drives or computer peripherals,” said Tim DeRosett, the manager of product marketing for Adept.

Adept offers vision-enabled robots in a variety of configurations, including SCARA robots (a four-axis robot with rotating elements that move in a single plane) in floor and tabletop models, six-axis robots, and Cartesian robots. For most applications, the image-capture camera that provides the robot with its vision is fixed above the work cell. Adept’s system uses a CCD camera connected to a Windows-based PC. Everything runs on an Ethernet network.

Adept’s iSight system uses PC-based software for vision guidance, robot-to-vision calibration, and motion control. The company typically ships a controller, a robot, and the vision system as a bundle; the user purchases his own PC, according to DeRosett.

The ability to calibrate the camera easily and precisely, and then adjust the robot arm is key, DeRosett said. “How you calibrate the camera determines your field of view. If that’s off, all your vision results are irrelevant,” he said. “After that, it’s vital to process the vision data and transfer the movement coordinates back to the robot in as timely a fashion as possible.”
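DeRosett’s point about calibration can be made concrete: teach the system a few pixel/robot-coordinate pairs (for instance by jogging the robot to marks the camera can see), then fit a map from camera pixels to positions on the robot’s working plane. The following is a hypothetical least-squares affine fit, not Adept’s iSight method; real calibrations also correct for lens distortion and perspective:

```python
import numpy as np

def fit_pixel_to_robot(pixels, robot_xy):
    """Least-squares affine map from camera pixel coordinates to
    robot-plane coordinates, fitted from a handful of taught
    (pixel, robot-position) reference pairs."""
    pixels = np.asarray(pixels, float)
    robot_xy = np.asarray(robot_xy, float)
    A = np.hstack([pixels, np.ones((len(pixels), 1))])  # rows: [u, v, 1]
    M, *_ = np.linalg.lstsq(A, robot_xy, rcond=None)    # 3x2 map
    return M

def pixel_to_robot(M, pixel):
    """Apply the fitted map to a single pixel coordinate."""
    u, v = pixel
    return np.array([u, v, 1.0]) @ M
```

If the fit is off, every downstream vision result is off with it, which is DeRosett’s warning: a miscalibrated field of view makes the coordinate data sent back to the robot irrelevant.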

DeRosett said that Adept’s system can process vision computations in one-half to two seconds and then adjust the position of the robot in near real time.

A major commercial bakery uses Adept’s vision-enabled robots to safely package its fresh-baked, delicate muffins. As the muffins come out of their baking pans, they travel by conveyor to the packaging area. There, two vision-enabled pick-and-place machines do their work. The robots each pick three muffins at a time off a conveyor belt that’s about 24 inches wide, and place the muffins into plastic trays. The plastic trays travel on a conveyor alongside the muffin conveyor for a smooth operation. The movements are gentle so the muffins are not damaged; the robots make between 80 and 120 picks each per minute.

Vision-enabled robots are being used increasingly for consumer electronics assembly, pick-and-place tasks, and food packaging.

The robots use Adept’s integrated vision system to locate each muffin, track it as it’s moving, and relay its position and orientation to the controller. The Adept controller tracks the velocity and direction of the conveyor belt, so the robots can be guided to the exact locations of the muffins. It also automatically adjusts the speed of the robot and conveyor so they match up. This enables the robots to pick up the muffins cleanly.
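The conveyor-tracking idea described above reduces to dead reckoning: each vision fix is advanced by the belt velocity times the elapsed time, so the gripper meets the part downstream of where the camera saw it. A toy sketch with invented names, assuming a constant belt speed in millimeters per second (a real controller reads the speed from a belt encoder and updates continuously):

```python
from dataclasses import dataclass

@dataclass
class BeltPart:
    x_at_capture: float  # position along the belt when imaged (mm)
    t_capture: float     # timestamp of the image (s)

def predicted_position(part, belt_speed_mm_s, t_now):
    """Dead-reckon the part's current position: advance the vision fix
    by belt velocity times the time elapsed since the image was taken,
    so the robot can intercept the part downstream."""
    return part.x_at_capture + belt_speed_mm_s * (t_now - part.t_capture)
```

For example, a muffin imaged at the 100 mm mark on a belt moving 250 mm/s will be at the 300 mm mark 0.8 seconds later, and the controller aims the pick there rather than at the stale camera position.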

“As the robot locates parts on the conveyor belt, it can also do some sorting and basic inspection,” DeRosett said. “It can eliminate parts that are already broken or that it can’t recognize because they’re misshapen.”

In addition to their precision, vision-enabled robots bring a new level of flexibility to the factory, a key to surviving in a lean manufacturing environment, DeRosett said.

“We’re seeing these robots becoming more popular in factories where there’s a high product mix, such as in consumer electronics and consumer products,” he said. “We have a customer who uses the vision-enabled robots to put customizable cosmetic kits together; the customer maintains a very low inventory of highly customizable product. He can do a product changeover daily because of the flexibility of the robots.”

All it takes to change over the robot system is some fairly basic tooling changes to the robot’s end-effector, and some programming changes in the software.

The combination of speed, relatively low cost, flexibility, and ease of use that vision-enabled robots offer is making an increasing number of factories consider putting another set of eyes on their lines.
