
Poster presenters


Isira Vithanage (University of Moratuwa)

Title: Soft worm-like burrowing robot


Abstract: Earthworms are agriculturally vital, improving soil drainage and mixing the soil, and they use an interesting motion to move within it. Hence, the authors propose an earthworm-mimicking soft robot to assist in soil preparation before cultivation. A soft worm-type robot can play a vital role in such agricultural applications because its high flexibility allows it to locomote in confined spaces without damaging the plants. The authors propose modifying an existing soft earthworm-mimicking robot, originally built for in-pipe locomotion, for soil burrowing. The robot consists of a front conical actuator, a rear radial actuator, and a middle actuator that expands axially for forward movement. The conical actuator burrows into the soil, while the radial actuator anchors the robot so that extension of the middle actuator delivers maximum forward motion.
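
As a rough illustration, the anchor-extend-release sequence of the three actuators can be sketched as a peristaltic gait cycle. This is a hypothetical controller: the actuator names, phase ordering, and stroke length are illustrative assumptions, not details from the poster.

```python
# Illustrative peristaltic gait for a three-actuator worm robot.
# Actuator names ("cone", "radial", "middle"), the phase ordering, and
# the stroke length are assumptions for illustration only.

GAIT_CYCLE = [
    {"radial": "anchor",  "cone": "burrow", "middle": "retracted"},
    {"radial": "anchor",  "cone": "idle",   "middle": "extended"},   # push the front forward
    {"radial": "release", "cone": "anchor", "middle": "extended"},
    {"radial": "release", "cone": "anchor", "middle": "retracted"},  # draw the rear forward
]

def run_gait(cycles, stroke_mm=20):
    """Idealized net displacement: one middle-actuator stroke per full
    gait cycle, assuming the anchored end never slips in the soil."""
    position = 0.0
    for _ in range(cycles):
        for phase in GAIT_CYCLE:
            # The body advances when the rear is anchored and the
            # middle actuator extends.
            if phase["radial"] == "anchor" and phase["middle"] == "extended":
                position += stroke_mm
    return position
```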


Rana Faryad Ali (Simon Fraser University)

Title: Will a Triad of Soft Robots, Hard Robots, and Drones Solve the Challenges in Agriculture? A Futuristic Perspective


Abstract: Agriculture is under pressure from labour shortages, a growing population, climate change, shrinking agricultural land, and the depletion of natural resources. These growing challenges require new technologies that can enable smart agricultural practices and meet the rising demand for food production to sustain life on this planet. Agricultural robots such as drones, hard robots, and soft robots are becoming friendly partners to farmers, helping to avoid food waste and providing higher operating efficiency, improved accuracy, and detailed monitoring of crops. But many challenges in agricultural robotics still need to be solved. Drones and hard robots have no doubt been employed in agriculture for many years, but why do we need soft robotics in smart farming and harvesting? What are the challenges in developing soft robots for agricultural practices? Is a single robotic technology sufficient to solve all the challenges in agriculture? In this futuristic perspective, I will discuss how modern robotic technologies, with a triad of soft robots, hard robots, and drones working synergistically, can make agricultural practices modern and efficient.


Victoria Oguntosin (Covenant University)

Title: Design of a Pneumatic Soft Actuator Controlled via Eye Tracking and Detection


Abstract: This work describes the control of a pneumatic soft robotic actuator via eye movements. The soft robot is actuated using two supply sources: a vacuum pump and an air-supply pump, providing negative and positive pressure, respectively. Two controlled states are presented: actuation of the vacuum pump and of the air pump. Eye position is tracked on a graphical user interface to select either pump, and a control command is issued to inflate or deflate the pneumatic actuator. The potential application is in rehabilitation, where eye movements, rather than ON/OFF electronics, control a rehabilitation-based assistive soft actuator. This is demonstrated in this work using an elbow-based soft rehabilitation actuator.
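
The gaze-to-pump mapping can be pictured as a small selection rule. A minimal sketch, assuming a screen divided into three horizontal regions; the region boundaries, command names, and screen width are illustrative, not from the paper:

```python
def pump_command(gaze_x, screen_width=1920):
    """Map a horizontal gaze coordinate on the GUI to a pump command.
    Left third -> air pump (inflate), right third -> vacuum pump
    (deflate), centre -> hold. Region boundaries are illustrative."""
    if gaze_x < screen_width / 3:
        return "inflate"   # positive-pressure air pump on
    if gaze_x > 2 * screen_width / 3:
        return "deflate"   # vacuum pump on
    return "hold"          # both pumps off
```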


Marina DeVlieg (Northwest Nazarene University)

Title: Automated Robotic Fruit Harvesting


Abstract: Fruit growers face a shortage of seasonal orchard workers. The goal of this research is to develop a multi-purpose robotic manipulator platform for orchard use, called OrBot. A closed-loop visual-feedback control system was developed that recognizes and harvests apples using a 6-DOF robotic manipulator equipped with a color sensor, a depth sensor, and a two-finger gripper. Images from the color sensor were filtered by color and object size to identify apples and guide arm movement. A correlation was developed to calculate the distance to the target apple from the depth sensor; the arm moved forward by the calculated distance and closed the gripper to secure the fruit. The arm then carried out a harvesting motion, removing the apple from the tree, returned to a home position, released the fruit, and checked for more apples. All trials so far have been conducted indoors with an artificial tree and apples. The robotic manipulator has successfully harvested four apples in succession.
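
The perception steps described above (colour filtering, size filtering, and a distance correlation from the depth sensor) might be sketched in outline as follows. The thresholds, minimum blob size, and linear correlation coefficients are illustrative stand-ins, not OrBot's actual values:

```python
import numpy as np

def apple_mask(rgb):
    """Boolean mask of 'apple-coloured' pixels: red clearly dominant
    over green and blue. Thresholds are illustrative, not OrBot's."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 120) & (r - g > 40) & (r - b > 40)

def target_centroid(mask, min_pixels=50):
    """Size filter plus target selection: reject tiny detections and
    return the (row, col) centroid of the surviving pixels, or None."""
    ys, xs = np.nonzero(mask)
    if ys.size < min_pixels:
        return None
    return float(ys.mean()), float(xs.mean())

def approach_distance_mm(depth_reading, a=0.5, b=12.0):
    """A linear correlation from raw depth-sensor reading to approach
    distance, standing in for the calibrated correlation; a and b are
    made-up coefficients."""
    return a * depth_reading + b
```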


Philip Johnson (University of Lincoln)

Title: Design and accurate control of a low-cost soft robotic arm for automated strawberry picking


Abstract: My research project concerns the development and control of a bespoke soft arm targeted at strawberry harvesting. The arm will be based on soft robotic technology, using compliant materials and novel soft actuation methods such as pneumatics or granular jamming. Multiple soft actuation types are being considered separately for the arm and the end-effector. The design will involve embedding customised flexible sensors to provide additional positional feedback so that accurate closed-loop control can be achieved. It will also consider delivering picked strawberries to the robot base through an internal passage in the soft arm, reducing the need to move the arm back to the base after each pick and hence shortening the picking cycle. The soft arm body can potentially be 3D printed from flexible materials to automate fabrication and yield a more consistent output.


Miranda Cravetz (Oregon State University)

Title: In-Hand Sensing for Robotic Apple Harvesting


Abstract: Robotic systems designed for fruit harvesting face both the challenge of operating in an unstructured environment and the challenge of manipulating a delicate object. Compliant materials can help with both problems, by mitigating the effect of localization error in the former case and by providing a soft interface between robot and object in the latter. However, it can be difficult to gather feedback about robotic grasps when using compliant systems, as force sensing and state estimation for such systems are still open problems. Our work uses a custom, sensorized end-effector with compliant flexures and soft fingerpads to enable the multi-modal sensing required to develop sophisticated grasping techniques for selective apple harvesting. Our early experiments have shown that our system can detect key events such as fruit slip and fruit separation.
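
As an illustration of the kind of event detection the abstract mentions, a grasp-force trace could be scanned for sharp drops (slip) and for the force settling near zero (separation). This is a minimal sketch with made-up thresholds, not the group's actual multi-modal detector:

```python
def detect_events(forces, slip_drop=0.3, sep_level=0.2):
    """Scan a grasp-force trace (newtons, fixed sample rate) for events.
    A 'slip' is a sharp sample-to-sample drop in force; 'separation' is
    the force settling near zero after the pull. Both thresholds are
    made-up values for illustration."""
    events = []
    for i in range(1, len(forces)):
        if forces[i - 1] - forces[i] > slip_drop:
            events.append((i, "slip"))
    if forces and forces[-1] < sep_level:
        events.append((len(forces) - 1, "separation"))
    return events
```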


Cameron Darkes-Burkey (Cornell University)

Title: Hydrogel Composites Towards Efficient Soil Hydration


Abstract: With water a critically valuable resource in crop production and water demand expected to increase in the coming years, hydrogel polymers added to the soil have the potential to alleviate some of the issues associated with agricultural water stress. These super-absorbent polymers can help maintain soil hydration and nutrient enrichment, enhance soil mechanical properties, and have possible future applications in soil sensing.


Jaeseok Kim (Scuola Superiore Sant'Anna)

Title: Tactile and RGB-D based strawberry maturity classification using deep learning and a transfer learning approach


Abstract: The strawberry is a popular fruit that contains abundant nutrients; in particular, it tastes very sweet when fresh and adequately matured. During the harvesting period, humans pick the ripened fruit using various sensory organs (i.e., eyes and touch). Many state-of-the-art strawberry classification methods evaluate maturity using hyperspectral, RGB, and depth image data to substitute for human work. However, image classification alone struggles to achieve satisfactory results. For this reason, we propose a deep learning model that classifies strawberry maturity from tactile-sensor and RGB-D image data. We expect that optimizing the network's hyper-parameters with the hyperopt library will enhance feature extraction and improve the network's performance. The proposed model will be evaluated with deep learning architectures such as ResNet and VGGNet in a transfer learning setting.


Arun Sivakumar (University of Illinois, Urbana-Champaign)

Title: Learned Visual Navigation for Under-Canopy Agricultural Robots


Abstract: We present a system for visually guided autonomous navigation of under-canopy farm robots. Low-cost under-canopy robots can drive between crop rows under the plant canopy and accomplish tasks that are infeasible for over-the-canopy drones or larger agricultural equipment. However, autonomously navigating them under the canopy presents a number of challenges: unreliable GPS and LiDAR, high cost of sensing, challenging farm terrain, clutter due to leaves and weeds, and large variability in appearance over the season and across crop types. We address these challenges by building a modular system that leverages machine learning for robust and generalizable perception from monocular RGB images from low-cost cameras, and model predictive control for accurate control in challenging terrain. Our system, CropFollow, is able to autonomously drive 485 meters per intervention on average, outperforming a state-of-the-art LiDAR-based system (286 meters per intervention) in extensive field testing spanning over 25 km.
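
The model-predictive-control idea can be illustrated in miniature: enumerate short control sequences over a receding horizon, score them against a lateral-offset cost, and apply the first control of the best sequence. Everything here (the unicycle-style model, candidate controls, and cost weights) is a hypothetical sketch, not CropFollow's controller:

```python
import math
from itertools import product

def simulate(y, theta, controls, v=1.0, dt=0.1):
    """Roll out a unicycle-style lateral-offset model (y = offset from
    the row centre, theta = heading) under a control sequence."""
    for u in controls:
        theta += u * dt
        y += v * math.sin(theta) * dt
    return y, theta

def mpc_step(y, theta, horizon=3, candidates=(-0.5, 0.0, 0.5)):
    """Pick the first control of the lowest-cost sequence over the
    horizon. Cost penalizes offset, heading, and control effort."""
    best_u, best_cost = 0.0, float("inf")
    for seq in product(candidates, repeat=horizon):
        yy, tt = y, theta
        cost = 0.0
        for u in seq:
            yy, tt = simulate(yy, tt, [u])
            cost += yy**2 + 0.1 * tt**2 + 0.01 * u**2
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

def follow_row(y0=0.5, theta0=0.0, steps=80):
    """Closed-loop run: apply the MPC's first control each step
    (receding horizon) and return the final absolute offset."""
    y, theta = y0, theta0
    for _ in range(steps):
        u = mpc_step(y, theta)
        y, theta = simulate(y, theta, [u])
    return abs(y)
```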


Nitin Rai (North Dakota State University)

Title: Streamlining Crop Stand Count Using An Intelligent Flying Machine


Abstract: Unmanned aerial systems (UASs) are thriving in the agricultural domain because of their promising results in crop phenotyping, which aims at quantifying physiological traits of crops such as crop stand count. Crop stand count plays an important role in genotype selection, but before it can be estimated, UAS-acquired images must be processed on a computer with dedicated software, which is time-consuming. We propose real-time crop stand counting on Nvidia-based embedded devices mounted on UASs. Integrating an embedded device with a UAS would bridge the gap between remotely processing the images and performing the stand count. It would allow farmers to accomplish stand counting on their mobile devices, with live video transmission and vision-based counting working synchronously with the flying machine. This would also pave the way for advanced image analysis that requires stand count as a foundational step.
