X-Y Axis
To realise the movement of the head in the x-y plane, the head is mounted on a 3D-printed connector fitted between two crossing 6 mm aluminium linear rods. Two LM6UU linear bearings ensure that the connector glides smoothly on these rods. The crossing 6 mm rods are in turn clipped into four UM2 Ultimaker 2 injection sliding blocks, which glide on 8 mm steel linear rods. Two F688ZZ flanged ball bearings per 8 mm rod, mounted in SK16 smooth rod supports, allow these rods to rotate with little friction. Timing belts couple the rotation of the stepper motors to the movement of the sliding blocks: each stepper motor drives an 8 mm steel rod via GT2 pulleys, and the belts fixed in the sliding blocks run from a linear rod on one side of the robot to the parallel rod on the opposing side. In total, two timing belts per axis plus one timing belt per motor are required for proper movement in the x-y plane.
Z-Axis
For the motion in the z direction, a threaded rod is coupled 1:1 to a stepper motor and mounted vertically with two ball-bearing shaft couplings. The rotation of the threaded rod is transmitted into a vertical movement of the sample stage via a ball screw. For greater stability and a more balanced force distribution, two additional 12 mm aluminium rods guide the vertical movement of the sample stage. One LM12LUU linear bearing per vertical rod provides nearly frictionless motion and further balances the load on the vertical axis; their mountings are attached to the sample stage and were specifically manufactured for an even force distribution. The sample stage itself consists of an aluminium frame holding an infrared-transparent plexiglass plate. To minimise weight, both the aluminium frame and the plexiglass are only 3 mm thick, which keeps the loss of stability acceptable.
+ | <li data-part="optics"> | ||
+ | The combination of lightbox and the head builds the fundamental entity to locate individual samples that are put into the robot for observation and detect fluorescent light of samples that are out of non natural amino acid. The localization works with the lightbox via built in infrared LED's whose light diffuses inside a diffusion plate, generating an evenly lightsource. The then emerging shadows can be detected with the head <i>Raspberry Pi-Cam</i>. The latter is attached to the head of the robot that additionally consists of optical components such as four high power LED's, various filters, a lens mounting and a tip that is connected to the syringe pump. As already mentioned the <i>Raspberry Pi-cam</i> is in charge of detecting different sample locations and fluorescent light of the mVenus protein. While the high power LED's ensure the excitation of the protein the filters, that are composed of a longpass-filter and sunglasses eliminate reflected light such that the camera only detects relevant light-information. The lens mounting works as an autofocus that's able to control definition of the camera. This works by applying different voltages to coils that move the lens inside a modified webcam. | ||
+ | </li> | ||
+ | <li data-part="syringepump"> | ||
+ | The syringe pump is the device of the robot that feeds the bacteria with non natural amino acid (short: nnAA) if it's needed. Its setup is made of a stepper motor that squeezes the nnAA out of the syringe and a 3D-printed framework that holds the syringe. It also includes a threaded rod for transmission between the stepper motor and the syringe. To hold the whole setup at a fixed position and still pipet on the right sample the end of the syringe is connected with the head by a flexible tube. Although it is a relative simple setup it is possible to determine the amount of the nnAA with a resolution of +-14 µl. | ||
+ | </li> | ||
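To illustrate where such a resolution comes from, the dispensed volume per motor step follows from the syringe cross-section and the plunger travel per step. The following is a rough sketch with illustrative values only; the syringe diameter, thread pitch and step count are assumptions, not the actual dimensions of our pump, and effects such as backlash and tube compliance are ignored:

```python
import math

# Illustrative values only -- not the actual dimensions of the pump.
syringe_diameter_mm = 14.0   # assumed inner diameter of the syringe
thread_pitch_mm     = 1.25   # assumed pitch of the threaded rod (mm per revolution)
steps_per_rev       = 200    # full steps per revolution of a typical stepper motor

# Plunger travel for a single full step of the motor
travel_per_step_mm = thread_pitch_mm / steps_per_rev

# Dispensed volume per step: cross-section area times travel (1 mm^3 = 1 µl)
area_mm2 = math.pi * (syringe_diameter_mm / 2.0) ** 2
volume_per_step_ul = area_mm2 * travel_per_step_mm

print(f"Plunger travel per step: {travel_per_step_mm * 1000:.1f} µm")
print(f"Dispensed volume per step: {volume_per_step_ul:.2f} µl")
```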
+ | <li data-part="chassis"> | ||
+ | The framework for the robot is built in a form of a simple cuboid made of 30 mm x 30 mm aluminium profiles. These profiles are connected with each other via M8 screws and drilled into ISO metric screw threads. Two extra horizontal aluminium profiles are added on the backside which hold the axes and the threaded rod needed for the vertical motion. Moreover, a thin aluminium frame is attached at the lower part of the 30 mm framework to mount the lightbox onto it. | ||
+ | </li> | ||
+ | <li data-part="electronics"> | ||
+ | The electrical parts used in this robot can be separated into those that directly manage the threedimensional movements and those that are in charge of the optical localization and detection. The threedimensional movements are controlled by an <i>arduino mega 2560</i> microcontroller connected to a <i>RepRap Arduino Mega Pololu Shield 1.4</i> (short: <i>RAMPS 1.4</i>) which enables, for example an intuitive connection of drivers and other electronical devices without using wires. The drivers used in this project are called <i>Pololu - DRV8825 Stepper Motor Driver Carrier</i> and allow the arduino to let the stepper motors make a thirtytwoth step and therefore increases the spatial resolution. Another part involved in the threedimensional movement are the so called endstops that assure that the head won't crash into the boundaries of the robot. The list of parts that control the optical localization and detection comprise a single-board computer with the name <i>Raspberry Pi 3</i> and controls several other electro-optical components. These include a <i>RaspberryPi-cam</i> that locates the samples inside the robot and the fluorescent light of the mVenus proteins as well as four high power LED's needed for the excitation of these proteins. To locate the samples several infrared LED's are used which are directly driven by an <i>ATX power supply</i>. | ||
+ | </li> | ||
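As a rough illustration of how 1/32 microstepping improves the spatial resolution, the usual steps-per-millimetre calculation for a belt-driven axis is sketched below. The motor step count, pulley size and belt pitch are typical values for this kind of printer mechanics, not measured values from our robot:

```python
# Illustrative calculation -- typical values for a GT2 belt drive, not measured on our robot.
full_steps_per_rev = 200   # standard 1.8 degree stepper motor
microstepping      = 32    # 1/32 microstepping enabled on the DRV8825 driver
belt_pitch_mm      = 2.0   # GT2 timing belt pitch
pulley_teeth       = 20    # teeth on the GT2 pulley

mm_per_rev   = belt_pitch_mm * pulley_teeth                      # belt travel per motor revolution
steps_per_mm = full_steps_per_rev * microstepping / mm_per_rev   # microsteps needed per millimetre

print(f"{steps_per_mm:.0f} steps/mm, i.e. {1000 / steps_per_mm:.2f} µm per microstep")
```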
ROBOTICS
ABSTRACT
One task of our project is to monitor manipulated E. coli and keep them alive, since they are not able to survive on their own due to their dependence on a non-natural amino acid. To catch the moment before the bacteria die, a fluorescent protein was introduced in such a way that it emits light as a warning signal.
Therefore, our team decided to build a fully automated pipetting robot that is able to locate a set of samples, detect potential light emission and pipet a specific amount of non-natural amino acid onto the fluorescent sample.
The robot is based on a 3D printer, since printers already handle movement in three dimensions with ease. By controlling these movements with an optical system, the autonomy of the robot is increased even further.
INTRODUCTION
Development of 3D printers & possibilities
In the 1980s Chuck Hull invented the first standardized 3D printer, based on a procedure known as stereolithography (SLA) [1]. Moving from SLA to fused deposition modeling (FDM) techniques, the 3D printing idea came alive in the do-it-yourself community. Ever since, basic 3D printers have been accessible for little money and, thanks to the open-source spirit of projects like RepRap [2], affordable for many. In last year's project, iGEM TU Darmstadt already built a fully working SLA printer that can be fed with biologically engineered plastics [3].
This year, the hardware branch decided to build a clone of the Ultimaker 2 FDM printer [4] and exchange the extruder for a camera and a pipette to create a pipetting robot. Using several open-source parts and software packages, the idea is to establish an easily manageable robot that helps with the biologist's daily work.
Lab 3.0
Following the concept of Industry 3.0, in which simple, repetitive tasks are automated by robots, we introduce the Laboratory 3.0, which enables scientists to concentrate on actual science by automating simple, repetitive work. This task is fulfilled by intelligent robots that are able to detect, react and respond to the experimenter.
Connection to our team
iGEM TU DARMSTADT is a young and dynamic team of engaged researchers. We have limited resources, namely time and money, and we have to reinvent our lab work every day in order to achieve more and stay focused on science. Thanks to iGEM, we have the possibility to experiment with our own ideas and to try reaching for the sky (or the biologist's equivalent of something really cool). Interdisciplinary work opens our minds and sharpens our sense for the important things. It is really helpful when we work together and benefit from each other.
References:
[1] http://edition.cnn.com/2014/02/13/tech/innovation/the-night-i-invented-3d-printing-chuck-hall
[2] http://reprap.org/wiki/About
[3] https://2015.igem.org/Team:TU_Darmstadt/Project/Tech
[4] http://www.thingiverse.com/thing:811271, jasonatepaint
GOALS
The main task is to develop a machine which is capable of monitoring our organisms and their health condition in order to keep them alive. The machine therefore has to measure the light emission of the organisms and be able to drop liquids into our containers. This has to work independently of the exact position of the container, which requires an automatic tracking system.
The idea is that one places a container somewhere in the machine's working area and clicks the run button of a program. The machine starts its routine by tracking the new container and measuring the light emission of the organisms. Based on the measurement, the machine decides whether or not to feed the organisms with non-natural amino acid. After a period of time it repeats this routine, until the stop button of the program is clicked.
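In pseudo-Python, this routine could look roughly like the sketch below; all function names and values are placeholders standing in for the behaviour described above, not our actual implementation:

```python
import time

# Placeholder functions standing in for the real hardware routines described above.
def scan_for_containers():
    return [{"x": 60, "y": 40}]          # dummy container position in mm

def move_head_to(container):
    print(f"moving head to {container}")

def measure_fluorescence():
    return 0.0                           # dummy brightness value

def dispense_nnaa(container):
    print(f"dispensing nnAA at {container}")

FLUORESCENCE_THRESHOLD = 0.5             # illustrative level of the warning signal
CHECK_INTERVAL_S = 5                     # shortened so the dummy loop finishes quickly

containers = scan_for_containers()       # track the newly placed container(s)
for _ in range(3):                       # the real robot repeats until the stop button is clicked
    for container in containers:
        move_head_to(container)
        if measure_fluorescence() > FLUORESCENCE_THRESHOLD:
            dispense_nnaa(container)     # feed the organisms with non-natural amino acid
    time.sleep(CHECK_INTERVAL_S)         # wait before repeating the routine
```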
These are only the minimum requirements for our project. We decided to go one step further and designed our machine in such a way that it serves as a multi-purpose platform which is adaptable and easy to modify. Our aim is to make it possible to add new features and software, inviting other scientists to improve our platform and share their ideas with the community.
For example, our liquid system can be upgraded to prepare 96-well plates with probes and to monitor routines using the optics.
Alternatively, our measuring head can be changed back into a printer head, which allows 3D printing again with just a few changes.
The concept of accurately positioning a probe in 3D space alone opens up a vast room of possibilities.
Because we try to stick to widely used open-source software and standard commercial parts, our machine can easily be combined with most DIY products, making it reusable, flexible and cheap.
In the special case of TU Darmstadt and the next generations of iGEM competitors, our idea is to develop our technical equipment further from year to year and to combine the individual machines. Our SLA printer from last year's competition was upgraded and is nearly ready for use again, giving us the possibility to manufacture parts for prototyping in our lab. This year's project will likewise serve as a starting point for next year's technical development team. New ideas and possibilities have already been discussed and we are looking forward to the next competition.
SETUP OVERVIEW
FUNCTIONALITY
To fulfill the task of keeping the bacteria alive, the robot loops through a specially designed procedure. Initially it scans the internal space for samples by lighting the underside of the shelf space with infrared LEDs and monitoring the shadows of the placed reservoirs with a camera. If the contrast is sufficiently high, which is guaranteed by isolating the interior from outside light sources, the robot is able to detect the edges of the reservoirs, fit circles onto them and compute the distance between each reservoir and the camera. Furthermore, an entire rack of reservoirs can be put under observation, since the robot is also able to recognize rectangles.
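A minimal sketch of this detection step is shown below, assuming the picamera and OpenCV Python packages on the Raspberry Pi; the resolution and the Hough transform parameters are illustrative, not the values used in the final robot:

```python
import cv2
import numpy as np
from picamera import PiCamera
from picamera.array import PiRGBArray

# Grab one frame from the Pi camera as a BGR array
camera = PiCamera(resolution=(640, 480))
raw = PiRGBArray(camera)
camera.capture(raw, format="bgr")
image = raw.array
camera.close()

# The backlit reservoirs appear as dark discs on the evenly lit diffusion plate
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
gray = cv2.medianBlur(gray, 5)

# Fit circles onto the reservoir edges (parameters are illustrative)
circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                           param1=100, param2=30, minRadius=15, maxRadius=80)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"reservoir at pixel ({x}, {y}), radius {r} px")
        # the pixel coordinates are then converted into head coordinates for the 3D control
```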
Shortly after the detection of the samples, the position data is transferred to the 3D control and the head of the robot moves towards the first reservoir. To check whether the bacteria need more nnAA, the robot uses the fluorescence of the protein mVenus that has been inserted into the bacteria: it excites the protein via high-power LEDs and detects the emitted photons. To exclude reflected light from the LEDs, which would disturb the measurement, a longpass filter cuts off the spectrum below the emission peak of the protein. Depending on the fluorescence signal, the robot decides whether it is necessary to pipet nnAA onto the sample. If so, the robot raises the sample stage in the z direction just far enough that the syringe tip reaches the sample and can securely add the nnAA.
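The decision itself can be reduced to comparing the brightness in a region of interest around the sample against a threshold. A simplified sketch follows; the threshold and the size of the region are illustrative, and the real robot additionally switches the excitation LEDs and works on the longpass-filtered image:

```python
import numpy as np

def needs_feeding(gray_frame, cx, cy, roi=40, threshold=60):
    """Return True if the mean brightness around the sample exceeds the warning level.

    gray_frame: filtered camera image as a 2D numpy array
    cx, cy:     pixel coordinates of the sample centre
    roi:        half-width of the square region of interest (illustrative)
    threshold:  brightness treated as the mVenus warning signal (illustrative)
    """
    patch = gray_frame[max(cy - roi, 0):cy + roi, max(cx - roi, 0):cx + roi]
    return float(patch.mean()) > threshold

# Example with a synthetic dark frame containing one bright (fluorescent) spot
frame = np.zeros((480, 640), dtype=np.uint8)
frame[200:280, 300:380] = 200
print(needs_feeding(frame, 340, 240))   # True -> pipet nnAA onto this sample
```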
Eventually the robot recommences the procedure described above, except for the scanning of the individual sample positions, which are stored temporarily until all samples have been checked. As long as the robot is switched on, connected to a power supply and the syringe pump does not run out of nnAA, the robot will loop through this whole process and keep the bacteria alive without any need for human interaction. Nevertheless, it is possible to check what the robot is doing via a live stream of the camera shown in the graphical user interface, since there is no other way to look inside the robot while it is working.
ACHIEVEMENTS
- Successfully redesigned a 3D printer chassis to meet our requirements
- Constructed a unique base platform with integrated IR LEDs for measuring and positioning purposes
- Designed a measuring probe with a camera, an integrated optical filter system and LEDs
- Constructed a syringe pump system to add liquids down to µl accuracy
- Implemented a unique GUI to run our program, featuring a multi-threading system that optimizes the overall performance
- Implemented an automatic object tracking system including a vector-based feedback system for positioning
- Connected a Raspberry Pi to an Arduino microcontroller via a serial connection, allowing a variety of different tasks (see the sketch after this list)
- Provided the data of all CADs designed by the TU Darmstadt technical department
- Provided a complete build instruction including a BOM (bill of materials incl. prices) and a step-by-step video tutorial
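The serial link mentioned in the list above can, for example, be used to send movement commands from the Raspberry Pi to the Arduino. Below is a minimal sketch using the pyserial package, assuming the Arduino runs Marlin and listens on /dev/ttyACM0 at 115200 baud; port, baud rate and coordinates are illustrative. Marlin acknowledges each command with an "ok" line, which is why the sketch waits for that reply before sending the next command.

```python
import time
import serial  # pyserial

# Open the USB serial connection to the Arduino (port and baud rate are illustrative)
arduino = serial.Serial("/dev/ttyACM0", 115200, timeout=2)
time.sleep(2)                      # give the board time to reset after opening the port

def send_gcode(line):
    """Send one G-code command and wait for Marlin's acknowledgement."""
    arduino.write((line + "\n").encode("ascii"))
    while True:
        reply = arduino.readline().decode("ascii", errors="ignore").strip()
        if reply.startswith("ok") or reply == "":
            return reply

send_gcode("G28")                  # home all axes against the endstops
send_gcode("G1 X60 Y40 F3000")     # move the head above an (illustrative) sample position
arduino.close()
```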
Mechanics
Optics
Operating range of wavelengths
Fluorescence measurement and filtering
Optical Hardware - Camera Head
Optical Hardware - Lightbox
Cooling
Software
Marlin
OpenCV
PyQt
Qt is a software framework for developing a GUI (graphical user interface). It is available under a commercial license and an open-source license. Qt is a cross-platform application framework, which means it runs on most operating systems, such as Unix-like systems and Windows. The underlying programming language is C++, and Qt can also be used together with existing languages like JavaScript, making it a powerful tool.
The main idea of Qt is a system of signals and slots, which provides an easy framework for connecting displayed elements with underlying functions and also enhances the reusability of existing code. Every graphical element, for example a button, emits its own signal when it is pressed or used. The signal can then be used to trigger an action, for example closing a window. If the signal is not connected to a function, nothing happens; the signal is simply emitted without consequences. By connecting an emitted signal to a desired action, called a slot, the program gets its unique behavior.
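A minimal signal/slot example, sketched here with PyQt5; the widget classes are part of the Qt API, the rest of the names are illustrative:

```python
import sys
from PyQt5.QtWidgets import QApplication, QPushButton, QVBoxLayout, QWidget

app = QApplication(sys.argv)

window = QWidget()
layout = QVBoxLayout(window)
button = QPushButton("Run")
layout.addWidget(button)

def on_run_clicked():
    # Slot: the action executed whenever the button emits its 'clicked' signal
    print("run button pressed")

button.clicked.connect(on_run_clicked)   # connect the signal to the slot
window.show()
sys.exit(app.exec_())
```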
Qt is widely used by organizations such as the European Space Agency, Samsung, DreamWorks, Volvo and many more.
To combine the possibilities of Qt with the simplicity of the Python programming language, PyQt was developed. PyQt is a Python binding that makes the Qt methods available within the Python syntax.
Qt Designer is a helpful tool for getting a direct preview of the GUI under construction. It is basically a visual construction tool in which the graphical objects can be moved around and arranged in the desired manner. To work with the resulting code, PyQt provides a command-line tool called pyuic, followed by the version number (for example pyuic5), which is executed in the terminal and converts the Qt Designer file into Python code.
After converting the code, one can open the GUI as a regular Python script and work with it as usual.
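A sketch of this workflow, assuming the interface was saved in Qt Designer as mainwindow.ui and converted with pyuic5; the file, module and class names are illustrative:

```python
# Convert the Qt Designer file once on the terminal (illustrative file names):
#   pyuic5 mainwindow.ui -o ui_mainwindow.py
#
# The generated ui_mainwindow.py can then be used like a regular Python module:
import sys
from PyQt5.QtWidgets import QApplication, QMainWindow
from ui_mainwindow import Ui_MainWindow   # class generated by pyuic5 from the .ui file

app = QApplication(sys.argv)
window = QMainWindow()
ui = Ui_MainWindow()
ui.setupUi(window)       # builds all widgets designed in Qt Designer onto the window
window.show()
sys.exit(app.exec_())
```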
FURTHER DEVELOPMENTS
Due to the tight time schedule from the start to the end of iGEM, it was not possible for us to realize all ideas and planned developments regarding improvements of the robot itself and further applications beyond its current, very specialized task.
First of all, it should be mentioned that the current model of the robot is designed to work with only one kind of bacteria culture, because the problem of sterility has not yet been solved. For working with several bacteria cultures it is absolutely indispensable to develop a system that avoids any contamination between the different cultures. One option would be an extra reservoir filled with ethanol in which the tip of the syringe can be sterilized between the checks of different samples.
Another modification that would be useful for working with individual bacteria cultures is to make the high-power LEDs exchangeable. This is necessary if the wavelength of the LEDs does not overlap with the absorption spectrum of the fluorescent proteins, or only overlaps with a part of the spectrum that has a very low absorption efficiency.
Moreover, apart from these developments, it may be useful to add a second syringe pump to the current setup, so that it would also be possible to remove liquid from the samples. Of course, this only makes sense if the above-mentioned sterilization process is implemented, because the tip of the removing syringe has to be inserted into the liquid and would therefore cause contamination when different bacteria cultures are used.
Besides, using the robot for purposes beyond its "normal" task of observing bacteria would be a neat extension. For example, modifying the robot into a functional 3D printer would be convenient, since its setup already resembles an Ultimaker 3D printer. The essential alterations would be to replace the sample stage with a heated bed and to replace the current head with a print head and hotend. Since the current head can simply be clipped off, this would not be too much of a challenge. Furthermore, the syringe extruder would also have to be changed if the printer is supposed to work with plastics.
An alternative approach would be a paste 3D printer. In this case it would not even be necessary to change the head and the syringe, because the paste already has the required viscous properties.