Team:TU Darmstadt/Hardware

 
{{Team:TU_Darmstadt/Viki}}

<html>
<body>
 
<div class="vviki" id="vviki">
    <!-- Styles for the printer -->
    <style type="text/css">
    .this-is-printer {
        text-align: center !important;
    }

    .this-is-printer [id^=printer_] {
        transition: opacity ease-in-out 0.25s;
        opacity: 1;
    }

    .this-is-printer .not-selected {
        opacity: 0.08;
    }

    .this-is-printer .selected {
        opacity: 1;
    }

    .this-is-printer .hovered {
        opacity: 0.6;
        cursor: pointer;
    }

    .description {
        font-family: calibri, sans-serif;
    }

    @media only screen and (max-width: 800px) {
        .this-is-printer svg {
            width: 80%;
        }
    }

    .pictures-results {
        display: flex;
        justify-content: center;
        flex-direction: row;
        justify-items: center;
    }

    .pictures-results > figure {
        justify-self: center;
        text-align: center;
        padding: 10px;
    }

    .results-text{
        text-align: center;
    }

    .pictures-results > figure > img {
        width: 90%;
    }

    </style>
    <!-- END -->
 
     <div id="head">
         <div id="title">
</html>{{Team:TU_Darmstadt/MainMenu}}<html>
<div class="banner">
     <img id="banner" src="https://static.igem.org/mediawiki/2016/0/08/T--TU_Darmstadt--Team_Robotics2.jpg" alt="teamphoto"></img>
</div>
<div id="mainHeader">
     <div class="abstract">
         <p><b>ABSTRACT</b><br/>
          <b>Our main task was to develop a device that measures mVenus fluorescence and adds liquids to sample containers.
          Therefore, our team decided to build a fully automated pipetting robot that is able to locate a set of samples, detect potential light emission and pipet a specific amount of non-natural amino acid solution into the fluorescent sample.<br>
            The foundation of the robot is a 3D-printer, due to the easy handling of movements in three dimensions. By controlling these movements with an optical system, the autonomy of the robot is further increased.</b></p>
     </div>
  
 
     <div class="content">

        <div class="pictures-results">
            <figure>
                <img src="https://static.igem.org/mediawiki/2016/0/0f/T--TU_Darmstadt--robot.jpg" />
            </figure>
            <figure>
                <img src="https://static.igem.org/mediawiki/2016/a/a4/T--TU_Darmstadt--robot1.jpg" />
            </figure>
            <figure>
                <img src="https://static.igem.org/mediawiki/2016/0/0b/T--TU_Darmstadt--robot2.jpg" />
            </figure>
        </div><br>
         <div class="verlinked" id="intro"><h5>INTRODUCTION</h5></div>
         <p><h6>Development of 3D Printers &amp; Possibilities</h6>
              <br>
              In the 1980s, Chuck Hull invented the first standardized 3D printer, based on a procedure known as stereolithography (SLA, <a href="#refe">[1]</a>). Moving from SLA to fused deposition modeling (FDM) techniques, the 3D printing idea came alive in the do&#8209;it&#8209;yourself community. Ever since, simple 3D printers have been available for little money and, thanks to the open-source idea of projects like REPRAP <a href="#refe">[3]</a>, affordable for many. In last year's project, iGEM TU Darmstadt already built a fully working SLA printer, capable of being fed with biologically manufactured plastics <a href="#refe">[2]</a>.<br>
              This year, the robotics team decided to rebuild a clone of the Ultimaker 2 FDM printer <a href="#refe">[4]</a> and replace the extruder with a camera and a pipet to create a pipetting robot. Using several open&#8209;source parts and software, the idea is to establish an easy-to-handle robot that assists a biologist's daily work.
            </p>
                        <br>
              iGEM TU DARMSTADT is a young and dynamic team of interdisciplinary and motivated researchers. Our advantage is that we can bring together synthetic biology and classic engineering sciences, for which TU Darmstadt is famous. Thanks to iGEM, we have the possibility to experiment on our own ideas and to reach for the stars. Being interested in a variety of scientific topics, we wanted to mix up different talents to create a unique project.

            </p>
 
         <div class="verlinked" id="goals"><h5>GOALS</h5></div>
         <p>
              The main task is to develop a machine which is capable of monitoring our organisms and their health condition (indicated by fluorescence) in order to keep them alive. Therefore, the machine has to measure the light emission of the organisms and needs to be able to drop liquids into sample containers. This has to be independent of the exact position of the container, which requires an automatic tracking system.<br>
              The idea is that one places a container somewhere under the robot's working area and clicks the <i>run</i> button of the program. The robot starts its routine by tracking the new container and measuring the light emission of the organisms. Based on this measurement, the robot decides whether to feed the organisms with non&#8209;natural amino acid solution or not. After a period of time, it repeats this routine until the <i>stop</i> button of the program is clicked.<br>
              These are only the minimum requirements for our project's needs. We decided to go one step further and designed our robot in such a way that it serves as a multi&#8209;purpose platform which is adaptable and easy to modify. The open&#8209;source character invites other scientists to add new features or improve the robot and its capabilities.<br>
              For example, our dispensing system can be upgraded to prepare 96-well plates with samples and to monitor routines using the optical system.
              Additionally, our measuring head can be changed back to a printer head, which allows 3D printing again with just a few changes.<br>
              There is a vast range of possibilities, just using the concept of accurately positioning a sample in 3D space.<br>
              Because we try to stick to widely used open-source software and standard commercial parts, our machine can easily be combined with most DIY products, making it reusable, flexible and cheap.<br>
              In the special case of TU Darmstadt and the next generations of iGEM competitors, we had the idea to develop our technical equipment further from year to year and, if possible, to combine the devices. Our SLA printer from last year's competition was upgraded and is nearly ready to use again, giving us the possibility to manufacture parts for prototyping in our lab. This year's project will also serve as a starting point for next year's technical development team. New ideas and possibilities have already been discussed and we are looking forward to next year's competition.<br>
            </p>
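        <p>The routine described above boils down to a simple control loop. The following Python sketch only illustrates this logic; the robot object and its helper methods (detect_samples, move_head_to, measure_fluorescence, dispense_amino_acid) as well as the threshold and pause values are hypothetical placeholders, not our actual implementation.</p>
<pre>
# Sketch of the monitoring routine described above. All helper methods and
# numeric values are assumed placeholders; the real software implements them
# with OpenCV, picamera and the Marlin/RAMPS motion stack.
import time

FLUORESCENCE_THRESHOLD = 40.0   # assumed intensity threshold (arbitrary units)
CYCLE_PAUSE_S = 600             # assumed pause between two passes over all samples

def run(robot, stop_requested):
    """Loop until the GUI sets the stop flag."""
    while not stop_requested():
        samples = robot.detect_samples()            # locate reservoirs via lightbox shadows
        for sample in samples:
            robot.move_head_to(sample.position)     # x-y move above the reservoir
            signal = robot.measure_fluorescence()   # long exposure through the longpass filter
            if signal >= FLUORESCENCE_THRESHOLD:    # sample still glows, so the culture is active
                robot.dispense_amino_acid(sample)   # feed non-natural amino acid solution
        time.sleep(CYCLE_PAUSE_S)                   # then repeat the routine
</pre>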
         <div class="verlinked" id="setup"><h5>SETUP OVERVIEW</h5></div>
 
             <li data-part="default">To get short information about our robot's single parts, click on the part of interest.</li>
             <li data-part="x-y-axis">
                 To realize the movement of the head in the x-y-plane, the head is mounted onto a 3D-printed connection element fitted between two crossing 6&nbsp;mm aluminum linear rods. To ensure smooth sliding of the connection on the linear rods, two <i>LM6UU</i> linear bearings were applied. The crossing 6&nbsp;mm aluminum linear rods are clipped onto four <i>UM2 Ultimaker 2 Injection Sliding Blocks</i>, which themselves slide on 8&nbsp;mm steel rods. Two <i>F688ZZ</i> flanged ball bearings per 8&nbsp;mm steel rod are plugged into <i>SK16</i> rod supports to ensure that they can rotate with little friction. Furthermore, <i>GT2</i> timing belts transfer the stepper motor's rotation into a linear movement of the head. These timing belts connect the stepper motors with the 8&nbsp;mm steel rods using <i>GT2</i> pulleys. Timing belts that are fixed in the sliding blocks range from one linear rod on one side of the robot to a parallel one on the opposing side. Therefore, two timing belts per axis and one timing belt per motor are necessary for a proper movement in the x&#8209;y&#8209;plane.
             </li>
             <li data-part="z-axis">
                 For the motion in z-direction, a threaded rod is coupled to a stepper motor with a <i>5&#8209;to&#8209;8&nbsp;mm</i> shaft coupling and vertically mounted with two <i>KP08</i> pillow blocks. The rotation of the threaded rod is then transmitted into a vertical movement of the sample stage via a ball screw. For greater stability and a more balanced force distribution, two additional 12&nbsp;mm aluminum rods help to guide the vertical movement of the sample stage. In addition, one <i>LM12LUU</i> linear bearing per vertical axis is used for nearly frictionless motion and an even more balanced force distribution on the vertical axis. The mounting for the latter is attached to the sample stage and specially manufactured, again for a balanced force distribution. The sample stage itself consists of an aluminum framework with an infrared&#8209;transparent Plexiglas panel. To minimize the weight of the sample plate while keeping it stable, the thickness of the table's components was chosen to be 3&nbsp;mm.
             </li>
            <li data-part="optics">
                 The main part of our optics is the combination of our lightbox, LEDs and a <i>Pi NoIR v2</i> camera. With these parts, we detect fluorescent light after excitation.
                 The localization works with the lightbox via built&#8209;in infrared LEDs whose light spreads inside a diffusion plate, generating an evenly distributed light source. Emerging shadows can be detected with the camera. The latter is attached to the head of the robot, which additionally consists of optical components such as four high power LEDs, various filters, a lens mounting and a tip that is connected to the syringe pump. As already mentioned, the camera is in charge of detecting the different sample locations and the fluorescent light of the mVenus protein. While the high power LEDs ensure the excitation of the protein, the filters, composed of a longpass filter and sunglasses, eliminate reflected light such that the camera only detects relevant signals. The lens mounting works as an autofocus that is able to control the sharpness of the camera. This works by applying different voltages to coils that move the lens inside a modified webcam.
             </li>
             <li data-part="syringepump">
                 The syringe pump is the device of the robot that feeds the bacteria with non-natural amino acid when necessary. Its setup is a stepper motor that squeezes the non-natural amino acid out of the syringe and a 3D-printed framework that holds the syringe. It also includes a threaded rod for transmission between the stepper motor and the syringe. To keep the whole setup at a fixed position and still pipet onto the right sample, the end of the syringe is connected to the head by a flexible tube.
             </li>
             <li data-part="chassis">
                 The framework of the robot is built in the form of a simple cuboid constructed from 30&nbsp;mm&nbsp;x&nbsp;30&nbsp;mm aluminum profiles. These profiles are connected to each other via M8 screws in drilled ISO metric screw threads. Two extra horizontal aluminum profiles are added on the backside, which hold the axes and the threaded rod needed for the vertical motion. Moreover, a thin aluminum frame is attached at the lower part of the 30&nbsp;mm framework to mount the lightbox onto it.
             </li>
             <li data-part="electronics">
                 The electrical parts used in this robot can be separated into those that directly manage the three-dimensional movements and those that are in charge of the optical localization and detection. The three-dimensional movements are controlled by an <i>Arduino Mega 2560</i> microcontroller connected to a <i>RepRap Arduino Mega Pololu Shield 1.4</i> (short: <i>RAMPS 1.4</i>), which provides an intuitive connection of drivers and other electronic devices without loose wiring. The drivers used in this project are <i>Pololu DRV8825 Stepper Motor Driver Carriers</i> and allow the Arduino to drive the stepper motors in 1/32 steps, which increases the spatial resolution. Additional parts involved in the three-dimensional movement are the so-called endstops, which assure that the head does not crash into the boundaries of the robot. The optical localization and detection are controlled by a <i>Raspberry Pi 3</i> single&#8209;board computer, which also drives several other electro&#8209;optical components. All necessary power is delivered by a standard <i>ATX</i> power supply. A minimal example of how the Raspberry Pi can command this motion controller follows below the parts list.        </li>
         </ul>
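        <p>Because the Arduino runs the Marlin firmware (see the Required Software section below), the Raspberry Pi can command every movement by sending plain G-code over the USB serial link. The following sketch illustrates this; the serial port name, feed rates, coordinates and the pulley tooth count in the comment are assumptions rather than our exact configuration.</p>
<pre>
# Illustrative only: command the RAMPS/Marlin stack by sending G-code over USB serial.
# Port, baud rate and coordinates are assumptions, not our exact settings.
import time
import serial

def send(ser, line):
    """Send one G-code line and wait for Marlin's 'ok' acknowledgement."""
    ser.write((line + "\n").encode("ascii"))
    while True:
        reply = ser.readline().decode("ascii", errors="ignore").strip()
        if reply.startswith("ok"):
            return

ser = serial.Serial("/dev/ttyACM0", 115200, timeout=5)
time.sleep(2)                      # Marlin resets when the port is opened
send(ser, "G28 X Y")               # home the x-y gantry against the endstops
send(ser, "G90")                   # absolute positioning
send(ser, "G1 X120 Y80 F3000")     # move the head above a sample (mm, mm/min)
send(ser, "M302 S0")               # allow "cold extrusion" so the syringe axis may move
send(ser, "G1 E2 F100")            # advance the syringe-pump stepper on the E0 driver by 2 mm
# With 200 full steps per revolution, 1/32 microstepping (DRV8825), 2 mm GT2 belt
# pitch and e.g. a 20-tooth pulley, the belt axes resolve 6400 / 40 = 160 steps per mm.
ser.close()
</pre>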
 
  
         <div class="verlinked" id="func"><h5>FUNCTIONALITY</h5></div>
         <p>
         The functionality of the pipetting robot comprises the three&#8209;dimensional agility of a 3D-printer and the possibility to pipet a specific amount of non&#8209;natural amino acid using a syringe pump. It also has intelligent visual object recognition, so that it is able to distinguish samples that require more non&#8209;natural amino acid from samples that still contain a sufficient amount.
         With that said, it is capable of autonomously keeping the modified <i>E.&nbsp;coli</i> bacteria alive, given that it is activated and connected to a reliable power supply.<br>
         To fulfill the task of keeping the bacteria alive, it loops through a specially designed procedure. Initially, the robot scans the working area for samples by illuminating the underside of the sample stage using infrared LEDs and monitoring the shadows of the placed reservoirs with a camera. If the contrast is sufficiently high, it is able to detect the edges of the mentioned reservoirs, fit a circle onto them and compute the distance between the reservoir and the camera itself. Furthermore, it is possible to put an entire rack of reservoirs under observation due to its ability to locate every individual reservoir.<br>
         Shortly after the detection, the distance information is sent to the 3D control program and the head of the robot moves in the direction of the first reservoir. To check whether the bacteria need more non&#8209;natural amino acid, the robot uses the fluorescence of the protein mVenus that is expressed by the bacteria. To do so, the robot excites the protein via high power LEDs and detects the emitted light. To exclude reflected light from the LEDs that would interfere with the measurement, a longpass filter cuts off the spectrum below the emission peak of the protein. Depending on the fluorescence signal, the robot decides whether it is necessary to pipet non&#8209;natural amino acid onto the sample. If that is the case, the robot moves the sample stage along the z-axis just far enough that the syringe reaches the sample and is able to securely add the non&#8209;natural amino acid.<br>
         Eventually, the robot recommences the procedure described above, except for the scanning of the individual positions of the samples, which are saved temporarily until all samples are checked. As long as the robot is activated, connected to a power supply and the syringe pump does not run out of non&#8209;natural amino acid, the robot will loop through this whole process and keep the bacteria alive without the need for human interaction.
         Nevertheless, it is possible to check what the robot is doing via a livestream of the camera visible on a graphical user interface, since there is no other way to look inside the robot while it is working.
         <br></p>
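        <p>The step from "circle found in the image" to "head centered above the reservoir" is a plain conversion from pixel offsets to millimeters. The sketch below shows the idea with an assumed calibration factor (millimeters per pixel at the working height); it is not the exact math of our control program.</p>
<pre>
# Sketch: convert a detected circle centre into a relative x-y correction move.
# MM_PER_PIXEL is an assumed calibration constant (determined by imaging an object
# of known size at the working distance); it is not our measured value.
MM_PER_PIXEL = 0.12
IMAGE_WIDTH, IMAGE_HEIGHT = 640, 480

def correction_move(circle_x, circle_y):
    """Return a relative G-code move that centres the head over the sample."""
    dx_mm = (circle_x - IMAGE_WIDTH / 2.0) * MM_PER_PIXEL
    dy_mm = (circle_y - IMAGE_HEIGHT / 2.0) * MM_PER_PIXEL
    return "G91\nG1 X%.2f Y%.2f F3000\nG90" % (dx_mm, dy_mm)

# Example: a reservoir detected 85 px right of and 40 px below the image centre
print(correction_move(405, 280))   # prints a relative move of about X10.2 Y4.8 (mm)
</pre>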
         <div class="verlinked" id="achie"><h5>ACHIEVEMENTS</h5></div>
 
         <div class="verlinked" id="results"><h5>RESULTS</h5>
 
        <h6>Circle Detection</h6><br>
        <center>
        </html>{{Team:TU_Darmstadt/Video|src=<html>https://static.igem.org/mediawiki/2016/0/07/T--TU_Darmstadt--Auto_tracking.mov</html>}}<html>
        </center>
        <p>The sample detection faced two problems which at first glance seem rather simple: detect a circle and detect a rectangle. The detection of a circle is the easier one, because a circle is an analytical function. This is already implemented in OpenCV, and we were able to use its circle detection for our sample tracking system.
The green circles are the found objects. The red lines are vectors from the central position of our probe to the sample positions. They are used to drive the stepper motors so that our probe can move to them. You can also see that changes of the probe's position are recognized and the vectors are recalculated.<br></p>
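        <p>The circle finding itself relies on OpenCV's Hough transform. A minimal version of this detection step could look like the sketch below; the blur size and Hough parameters are typical starting values, not our tuned settings.</p>
<pre>
# Minimal circle detection with OpenCV, as used for the sample tracking.
# Parameter values are generic starting points, not our tuned configuration.
import cv2

frame = cv2.imread("stage.jpg", cv2.IMREAD_GRAYSCALE)   # shadow image from the lightbox
frame = cv2.medianBlur(frame, 5)                         # suppress sensor noise

circles = cv2.HoughCircles(frame, cv2.HOUGH_GRADIENT, dp=1, minDist=40,
                           param1=80, param2=30, minRadius=10, maxRadius=60)

if circles is not None:
    for cx, cy, r in circles[0]:
        cx, cy, r = int(round(cx)), int(round(cy)), int(round(r))
        cv2.circle(frame, (cx, cy), r, 255, 2)           # outline of the detected reservoir
        cv2.circle(frame, (cx, cy), 2, 255, 3)           # mark its centre
cv2.imwrite("detected.jpg", frame)
</pre>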
        <h6>Multi-Object Detection</h6><br>
 
         <div class="pictures-results">
             <figure>
                 <img src="https://static.igem.org/mediawiki/2016/8/89/T--TU_Darmstadt--sample_detection_first_shot.jpg" />
             </figure>
             <figure>
                 <img src="https://static.igem.org/mediawiki/2016/c/c8/T--TU_Darmstadt--sample_detection_improvement.jpg" />
             </figure>
             <figure>
                 <img src="https://static.igem.org/mediawiki/2016/5/5c/T--TU_Darmstadt--sample_detection_sugar.jpg" />
             </figure>
         </div>

            <p>
                A more delicate task was to detect round samples stored in a rack. The difficulty is that the objects intersect and the algorithm needs to distinguish them. The pictures show how the camera sees the whole object; even with the naked eye it is difficult to make out a rectangle. We improved the algorithm step by step using prominent features of the rack. The result is shown in the last picture.
            </p>
         </div>
        <h6>Rectangle Detection</h6><br>
        <center>
        </html>{{Team:TU_Darmstadt/Video|src=<html>https://static.igem.org/mediawiki/2016/3/32/T--TU_Darmstadt--sample_tracking.mp4</html>}}<html></center>
<p>Rectangle detection is more complicated, but we were able to solve this challenge as well. You can see how the algorithm detects the rectangles and marks them, also tracking the individual samples.<br></p>
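        <p>Our rectangle detection follows the classic contour-approximation approach: threshold the image, find the outer contours, approximate each contour by a polygon and keep the four-cornered candidates. The sketch below shows this principle; the threshold and area limits are placeholders rather than our calibrated numbers.</p>
<pre>
# Sketch of the rectangle (rack) detection via contour approximation.
# Threshold and area limits are placeholders, not our calibrated values.
import cv2

img = cv2.imread("stage.jpg", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(img, 60, 255, cv2.THRESH_BINARY_INV)    # racks cast dark shadows

# index [-2] picks the contour list in both the OpenCV 2/4 and OpenCV 3 return formats
contours = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]

rectangles = []
for cnt in contours:
    if cv2.contourArea(cnt) > 500:                                # ignore small specks
        approx = cv2.approxPolyDP(cnt, 0.02 * cv2.arcLength(cnt, True), True)
        if len(approx) == 4:                                      # four corners: rectangle candidate
            rectangles.append(cv2.boundingRect(approx))

for x, y, w, h in rectangles:
    cv2.rectangle(img, (x, y), (x + w, y + h), 255, 2)            # mark the detected rack
cv2.imwrite("racks.jpg", img)
</pre>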
  
<h6>Full Functionality</h6><br>
        <center>
        </html>{{Team:TU_Darmstadt/Video|src=<html>https://static.igem.org/mediawiki/2016/6/67/T--TU_Darmstadt--finalfunct.mp4</html>}}<html></center>
<p>As shown in the video, our robot starts its routine and goes from sample to sample, checking for light emission. For demonstration purposes we installed an LED. One can see that the optics detect the LED light and the head then moves down to dispense liquid into the sample, because our pipette is installed sideways on the optical head. It then moves back to its original position and continues its routine. At the end of the video we show the possible adjustments such as brightness and focus.</p>
<center>
<video width="600" controls>
  <source src="https://static.igem.org/mediawiki/2016/8/88/T--TU_Darmstadt--rectang.mp4" type="video/mp4">
  Your browser does not support HTML5 video.
</video>
</center>
<p>In this video we additionally demonstrate the recognition of rectangles and circles. Using OpenCV and the Raspberry Pi Cam, we can detect our glowing samples, here simulated with a green LED. After the glow is detected (in the case of a fluorescent protein, induced by the blue high power LEDs), we pipet a drop of the respective substance (in our case nnAA) into the sample.</p>
  
<div class="verlinked" id="develop"><h5>FURTHER DEVELOPMENTS</h5></div>
        <p>Due to the tight time schedule from the start to the end of iGEM, it was not possible for us to realize all ideas and planned developments with respect to improving the robot itself.
            For a working process with more kinds of bacteria cultures, it is absolutely indispensable to develop a system that is able to avoid all sorts of contamination between the different bacteria. Therefore, it would be an option to have an extra reservoir filled with ethanol in which the tip of the syringe can be sterilized between the checks of different samples.
            Another modification that would be useful for working with individual bacteria cultures is making the power LEDs exchangeable. This is necessary if the wavelength of the LEDs does not overlap with the absorption spectrum of the fluorescent proteins or only overlaps with a part of the spectrum that has a very low absorption efficiency.
            <br>Moreover, apart from the latter developments, it may be useful to improve the syringe pump system. Instead of a syringe pump, one could use a system with a reservoir of liquids and a pump that works continuously like a turbine, see for example <a href="http://www.ardulink.org/automatic-lipid-dispensing/">http://www.ardulink.org/automatic-lipid-dispensing/</a>.
            <br>Another useful modification of the robot would be to rebuild its foundation, namely an <i>Ultimaker 3D-printer</i> setup. Essential alterations would be to replace the sample stage with a heatbed and to replace the current head with a printhead hotend. Since the current head can be clipped off, it would not be too much of a challenge. Furthermore, a change of the syringe extruder is necessary if the printer should work with plastics.
            An alternative approach is a kind of paste 3D-printer. In this case it would not even be necessary to change the head and the syringe, because the paste is already viscous.
        </p>
  
         The determination of ROIs occurs simultaneously with the detection of samples.
         The filtering is very strong, and does indeed block some of the already weak mVenus fluorescence.
         This is why we use an exposure time of 10 seconds for a fluorescence capture, to make sure enough data is collected.
<div class="pictures-results">
            <figure>
                <img src="https://static.igem.org/mediawiki/2016/9/91/T--TU_Darmstadt--filter.jpg" width="330" height="550" />
            </figure>
        </div>
</p><p>
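<p>The 10-second fluorescence exposure mentioned above is set up through the picamera interface <a href="#refe">[5]</a>. A capture of this kind could look like the sketch below; resolution, ISO and framerate are illustrative values, and the real software additionally evaluates only the previously determined ROIs.</p>
<pre>
# Sketch of a long fluorescence exposure with the Pi NoIR camera via picamera [5].
# Resolution, ISO and framerate are illustrative values, not our exact settings.
from fractions import Fraction
from time import sleep
from picamera import PiCamera

with PiCamera(resolution=(1640, 1232), framerate=Fraction(1, 10)) as cam:
    cam.shutter_speed = 10 * 1000 * 1000   # 10 s exposure, given in microseconds
    cam.iso = 800                          # high sensitivity for the weak mVenus signal
    sleep(2)                               # let the automatic gains settle ...
    cam.exposure_mode = "off"              # ... then freeze them for reproducible captures
    cam.capture("fluorescence.jpg")        # the ROIs are evaluated in this image afterwards
</pre>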
  
 
         <div class="verlinked" id="cam"><h6>Optical Hardware - Camera Head</h6></div>
         We are using the 8 megapixel PiCamera because we have access to its capturing settings like framerate,
         exposure time, gains, light sensitivity etc. over an existing programming interface <a href="#refe">[5]</a>.
         This is absolutely necessary since detection and measurement have totally different requirements.
         Another benefit is that we are able to capture directly in grayscale (Y part of YUV) for fast detection purposes, and switch to RGB when doing a fluorescence measurement.
 
         The camera head also includes the stimulating LEDs, which are 4 high power Cree XTE, driven by a 0.9 amp
         current source and a PWM signal delivered by the Raspberry Pi.
        <br>
<div class="pictures-results">
            <figure>
                <img src="https://static.igem.org/mediawiki/2016/7/72/T--TU_Darmstadt--reinraum1.jpg" style="width: 50%;" height="320" />
            </figure>
            <figure>
                <img src="https://static.igem.org/mediawiki/2016/4/48/T--TU_Darmstadt--reinraum2.jpg" style="width: 50%;" height="320" />
            </figure>
        </div>
<br>
We mounted the self-made autofocus in a cleanroom at the <a href="http://www.idd.tu-darmstadt.de/idd/aktuelles/index.de.jsp">"Institut für Druckmaschinen und Druckverfahren" TU Darmstadt</a>. This was done to keep the lens free of dust.
We are thankful that we could work under these conditions, but a cleanroom is not mandatory for the autofocus. Just make sure to assemble it in a clean, dust-free environment!
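<p>The PWM signal for the excitation LEDs comes straight from a GPIO pin of the Raspberry Pi. A minimal sketch with the RPi.GPIO library is shown below; the pin number, frequency and duty cycle are placeholders, not our actual wiring.</p>
<pre>
# Sketch: drive the high-power excitation LEDs with a PWM signal from the Pi.
# Pin number, frequency and duty cycle are placeholders, not the actual wiring.
import time
import RPi.GPIO as GPIO

LED_PIN = 12                     # assumed physical pin (GPIO.BOARD numbering)

GPIO.setmode(GPIO.BOARD)
GPIO.setup(LED_PIN, GPIO.OUT)

pwm = GPIO.PWM(LED_PIN, 1000)    # 1 kHz PWM frequency
pwm.start(0)                     # start with the LEDs off
pwm.ChangeDutyCycle(60)          # 60 % brightness during the fluorescence excitation
time.sleep(10)                   # keep exposing while the camera integrates
pwm.stop()
GPIO.cleanup()
</pre>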
 
         <br>
 
 
         <div class="verlinked" id="lightbox"><h6>Optical Hardware - Lightbox</h6></div>
 
         allowing us to use less strict thresholds. It also emphasizes the samples' corners and enhances the detection reliability.
         The heart of the lightbox is a Plexiglas panel called “Endlighten”.
         Light is laterally injected and reflected off systematic impurities inside the panel <a href="#refe">[6]</a>.
         The light then leaves the panel uniformly in all directions.
         Light leaving the panel back side is mirrored to the front side by a white reflective Plexiglas,

         Endlighten panel. The LEDs are driven by constant current sources to give them a long lifetime.
         </p>
 
         <div class="verlinked" id="cool"><h5>Cooling</h5></div>
 
  
 
         <div class="verlinked" id="pyqt"><h6>PyQt</h6></div>
         <p>Qt is a software tool to develop a GUI (Graphical User Interface) <a href="#refe">[7]</a>. It is available under a commercial and an open-source license. The software is a cross-platform application framework, which means it runs on most computer systems, such as Unix or Windows. The underlying programming language is C++, and Qt can interface with existing programming languages like JavaScript, making it a powerful tool.<br>
             The main idea of Qt is to use a system of signals and slots as an easy framework to connect displayed elements with underlying functions. Also, the reusability of already existing code is enhanced. Every graphical element, for example a button, emits its own signal when it is pressed or used. The signal can then be used to trigger an action, like closing a window. If the signal is not connected to a function, nothing happens; the signal is simply emitted without consequences. Now it is possible to connect the emitted signal with a desired action, called a slot, and the program gets its specific behavior.<br>
             Qt is widely used by companies like the European Space Agency (ESA), Samsung, DreamWorks, Volvo and many more.

             After converting the code, one can open the GUI as a regular python script and work with it as usual. </p>
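        <p>As a concrete example of the signal/slot mechanism, the following sketch connects a button's <i>clicked</i> signal to a slot that would start the robot routine. It is written against PyQt5 for illustration; the widget layout and the start_routine slot are examples, not our actual GUI code.</p>
<pre>
# Illustration of Qt's signal/slot mechanism with PyQt (PyQt5 shown here).
# The widget layout and the start_routine slot are examples, not our actual GUI.
import sys
from PyQt5.QtWidgets import QApplication, QPushButton, QVBoxLayout, QWidget

def start_routine():
    print("run button pressed - the monitoring loop would start here")

app = QApplication(sys.argv)
window = QWidget()
layout = QVBoxLayout(window)

run_button = QPushButton("Run")
run_button.clicked.connect(start_routine)   # signal 'clicked' connected to slot 'start_routine'
layout.addWidget(run_button)

window.show()
sys.exit(app.exec_())
</pre>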
  
         <center>
 
</html>{{Team:TU_Darmstadt/Video|src=<html>https://static.igem.org/mediawiki/2016/f/f3/T--TU_Darmstadt--construction.mp4</html>}}<html>
</center><br>
<div class="verlinked" id="circ"><h6>Circuit Diagram</h6></div><br>
<center><img src="https://static.igem.org/mediawiki/2016/b/b0/T--TU_Darmstadt--circuit_diagram.svg" width="95%"/></center>
<p>
The stepper motors for X and Y are plugged into the RAMPS board as shown in the graphic. Prepared cables with corresponding connectors are readily available. To control and power the motors, the stepper drivers (A4988) are plugged into the RAMPS slots marked with X and Y. On the bottom of each stepper driver it is marked which pin is ground and so on; the RAMPS board is likewise marked where the corresponding slots for ground etc. are. The X and Y stepper motors have one endstop each to provide a stop signal for the homing routine. For safety reasons, it is also possible to add one more endstop to each axis, so that the movement reliably stops at the end positions. All endstops are plugged into the marked positions on the RAMPS board. From left to right, the pins are for the X min and X max position, the Y min and Y max position and the Z min and Z max position. The pin marked with a + is reserved for the 5 V supply, the middle pin is ground and the last pin, marked with an S, is the signal.
The Z stepper motor has two endstops and is connected like the X and Y stepper motors and endstops.
To drive our syringe pump, one more stepper motor and one endstop are required; they are plugged into the E0 position and connected like X, Y and Z.<br>
To power the RAMPS board, 12 V is needed, for example from an ATX power supply.
The Raspberry Pi 3 is responsible for controlling the optics and LEDs, whereas the Arduino controls the stepper motors and endstops. Arduino and Raspberry Pi are connected via one of the Pi's USB ports and the USB Type B input of the Arduino.
The lightbox is powered with 24 V from the ATX power supply. The lightbox consists of 4 LED bars, each carrying 10 infrared LEDs in series. Each bar is driven by one 50 mA constant current source. These are connected in parallel to the -12 V of the power supply and to the NC contact of one relay. The COM port of that relay is then connected to the +12 V of the power supply.
The 2-relay module serves as a Raspberry Pi controlled switch which controls the lightbox and the blue exposing LEDs. The corresponding wire numbers refer to the used GPIO pins. Be careful: the internal GPIO layout differs from the numbering of the Pi layout scheme. All numbers correspond to the latter!<br>
The L298N are DC motor drivers which we repurposed as a protection circuit to separate the 12 V from the Raspberry Pi's GPIOs.
The blue exposing LEDs are connected in series and controlled by the 2-relay module on one side.
</p>
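<p>The two relay channels are switched from Python like any other GPIO output. The sketch below toggles the lightbox and the blue LEDs; the pin numbers are placeholders and, following the note above, refer to the physical pin numbering of the Pi layout (GPIO.BOARD). The relay board is assumed to be active-low, which is common for these modules but should be checked against your own hardware.</p>
<pre>
# Sketch: switch the lightbox and the blue LEDs through the 2-relay module.
# Pin numbers are placeholders; GPIO.BOARD refers to the physical Pi pin layout.
# The relay board is assumed to be active-low (LOW energises the relay).
import time
import RPi.GPIO as GPIO

LIGHTBOX_RELAY = 16     # assumed physical pin for the infrared lightbox channel
BLUE_LED_RELAY = 18     # assumed physical pin for the blue excitation LEDs

GPIO.setmode(GPIO.BOARD)
GPIO.setup([LIGHTBOX_RELAY, BLUE_LED_RELAY], GPIO.OUT, initial=GPIO.HIGH)

GPIO.output(LIGHTBOX_RELAY, GPIO.LOW)    # lightbox on for the sample localization
time.sleep(3)
GPIO.output(LIGHTBOX_RELAY, GPIO.HIGH)   # lightbox off again

GPIO.output(BLUE_LED_RELAY, GPIO.LOW)    # blue LEDs on for the fluorescence excitation
time.sleep(10)
GPIO.output(BLUE_LED_RELAY, GPIO.HIGH)
GPIO.cleanup()
</pre>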
<div class="verlinked" id="zips"><h6>Required Software</h6></div>
<p>In case you want to rebuild our project, you can download all needed software in this section. The software includes the exact <a href="https://static.igem.org/mediawiki/2016/3/38/T--TU_Darmstadt--Marlin.zip">Marlin firmware</a> which we uploaded to the Arduino, as well as the software that is installed on the <a href="https://static.igem.org/mediawiki/2016/8/85/T--TU_Darmstadt--Raspberry.zip">Raspberry Pi</a> and some improved <a href="https://static.igem.org/mediawiki/2016/a/af/T--TU_Darmstadt--python.zip">python scripts</a>. Furthermore, to enable wireless support, install and configure "hostapd" and "dnsmasq" on your Raspberry Pi. We followed this external <a href="https://frillip.com/using-your-raspberry-pi-3-as-a-wifi-access-point-with-hostapd/">tutorial</a>.<br></p>
  
 
<div class="verlinked" id="bom"><h6>Bill Of Materials</h6></div>
 
Nevertheless, if you are still interested in constructing this robot, you can take a look at the bill of materials <a href="https://static.igem.org/mediawiki/2016/6/64/T--TU_Darmstadt--BOM.pdf">here</a>.
</div>
<div class="verlinked" id="refe">
<div class="references"><h6>References</h6>
<ul><li>[1] <a href="http://edition.cnn.com/2014/02/13/tech/innovation/the-night-i-invented-3d-printing-chuck-hall/">http://edition.cnn.com/2014/02/13/tech/innovation/the-night-i-invented-3d-printing-chuck-hall</a>
</li><li>[2] <a href="https://2015.igem.org/Team:TU_Darmstadt/Project/Tech">https://2015.igem.org/Team:TU_Darmstadt/Project/Tech</a>
</li><li>[3] <a href="http://reprap.org/wiki/About">http://reprap.org/wiki/About</a>
</li><li>[4] <a href="http://www.thingiverse.com/thing:811271">http://www.thingiverse.com/thing:811271, jasonatepaint</a>
</li><li>[5] <a href="https://picamera.readthedocs.io/en/release-1.12/">https://picamera.readthedocs.io/en/release-1.12/</a>
</li><li>[6] <a href="https://www.plexiglas-shop.com/pdfs/en/212-15-PLEXIGLAS-LED-edge-lighting-en.pdf">https://www.plexiglas-shop.com/pdfs/en/212-15-PLEXIGLAS-LED-edge-lighting-en.pdf</a>
</li><li>[7] <a href="https://en.wikipedia.org/wiki/Qt_(software)">https://en.wikipedia.org/wiki/Qt_(software)</a>
</li></ul></div>
</div>
</div>
<div class="rechts">
     <div class="scrollbox">
         <div class="highlights">
             <a href="#intro">Overview</a><br/>
            <a href="#setup">Setup Overview</a><br/>
             <a href="#func">Functionality</a><br/>
             <a href="#results">Results</a><br/>
             <a href="#develop">Further Developments</a><br/>
             <a href="#optics">Technical Details</a><br/>
            <a href="#softw">Software</a><br/>
         </div>
         <a href="#mainHeader"><button class="back_top_full">Back to the Top</button></a>

Latest revision as of 03:53, 20 October 2016


ABSTRACT
Our main task was to develop a device that measures mVenus fluorescence and adds liquids to sample containers. Therefore, our team decided to build a fully automatized pipetting robot that is able to locate a set of samples, detect potential light emission and pipet a specific amount of non-natural amino acid solution into the fluorescent sample.
The foundation for the robot is a 3D-printer, due to the easy handling of movements in three dimensions. By controlling these movements with an optical system, the autonomy of the robot is further increased.


INTRODUCTION

Development of 3D Printers & Possibilities

In the 1980s, Chuck Hull invented the first standardized 3D printer, based on a procedure known as stereolithography (SLA) [1]. Moving from SLA to fused deposition modeling (FDM) techniques, the idea of 3D printing came alive in the do-it-yourself community. Since then, simple 3D printers have become available for little money and, thanks to the open-source spirit of projects like RepRap [3], affordable for many. In last year's project, iGEM TU Darmstadt already built a fully working SLA printer capable of being fed with biologically manufactured plastics [2].
This year, the robotics team decided to build a clone of the Ultimaker 2 FDM printer [4] and replace the extruder with a camera and a pipette to create a pipetting robot. Using several open-source parts and software, the idea is to establish an easy-to-handle robot that assists the biologist's daily work.


iGEM TU DARMSTADT is a young and dynamic team of interdisciplinary and motivated researchers. Our advantage is that we can bring together synthetic biology and the classic engineering sciences for which TU Darmstadt is famous. Thanks to iGEM, we have the opportunity to experiment with our own ideas and to reach for the stars. Being interested in a variety of scientific topics, we wanted to mix different talents to create a unique project.

GOALS

The main task is to develop a machine capable of monitoring our organisms and their health condition (encoded by fluorescence) in order to keep them alive. Therefore, the machine has to measure the light emission of the organisms and needs to be able to drop liquids into sample containers. This has to work independently of the exact position of the container, which requires an automatic tracking system.
The idea is that one places a container somewhere in the robot's working area and clicks the run button of a program. The robot starts its routine by tracking the new container and measuring the light emission of the organisms. Based on this measurement, the robot decides whether or not to feed the organisms with non-natural amino acid solution. After a period of time it repeats this routine until the stop button of the program is clicked.
These are only the minimum requirements for our project's needs. We decided to go one step further and designed our robot in such a way that it serves as a multi-purpose platform which is adaptable and easy to modify. Its open-source character invites other scientists to add new features or improve the robot and its capabilities.
For example, our dispensing system can be upgraded to prepare 96-well plates with samples and to monitor routines using the optical system. Additionally, our measuring head can be changed back to a printer head, which allows 3D printing with just a few changes.
There is a vast range of possibilities that build on the concept of accurately positioning a sample in 3D space.
Because we stick to widely used open-source software and standard commercial parts, our machine can easily be combined with most DIY products, making it reusable, flexible and cheap.
For TU Darmstadt and the next generations of iGEM competitors in particular, our idea is to develop our technical equipment from year to year and, if possible, combine it. Our SLA printer from last year's competition was upgraded and is nearly ready to use again, giving us the possibility to manufacture parts for prototyping in our lab. This year's project will also serve as a starting point for next year's technical development team. New ideas and possibilities have already been discussed and we are looking forward to next year's competition.

SETUP OVERVIEW
(Interactive SVG diagram of the robot setup; hover over a component to see its description.)

FUNCTIONALITY
The functionality of the pipetting robot combines the three-dimensional agility of a 3D printer with the ability to pipet a specific amount of non-natural amino acid using a syringe pump. It also features visual object recognition, so it can distinguish samples that require more non-natural amino acid from samples that still contain a sufficient amount. With that, it is capable of autonomously keeping the modified E. coli bacteria alive, provided it is activated and connected to a reliable power supply.
To fulfil this task it loops through a specially designed procedure. Initially, the robot scans the working area for samples by illuminating the sample stage from below with infrared LEDs and monitoring the shadows of the placed reservoirs with a camera. If the contrast is sufficiently high, it can detect the edges of the reservoirs, fit circles onto them and compute the distance between each reservoir and the camera. Because it locates every individual reservoir, an entire rack of reservoirs can be placed under observation.
Shortly after detection, the distance information is sent to the 3D control program and the head of the robot moves towards the first reservoir. To check whether the bacteria need more non-natural amino acid, the robot uses the fluorescence of the protein mVenus expressed by the bacteria. It excites the protein via high-power LEDs and detects the emitted light. To exclude reflected LED light that would interfere with the measurement, a longpass filter cuts off the spectrum below the emission peak of the protein. Depending on the fluorescence signal, the robot decides whether it is necessary to pipet non-natural amino acid onto the sample. If so, it raises the sample stage along the z-axis just far enough that the syringe reaches the sample and can securely add the non-natural amino acid.
Eventually the robot recommences the procedure described above, except for the scanning of the individual sample positions, which are stored temporarily until all samples have been checked. As long as the robot is activated, connected to a power supply and the syringe pump does not run out of non-natural amino acid, it will loop through this whole process and keep the bacteria alive without human intervention. Nevertheless, it is possible to check what the robot is doing via a live camera stream shown in a graphical user interface, since there is no other way to look inside the robot while it is working.
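To make the routine above concrete, here is a minimal sketch of such a monitoring loop in Python. The helper names (robot.scan_for_samples, camera.measure_fluorescence, pump.dispense_microliters) as well as the threshold and timing values are placeholders for illustration, not our actual implementation; the real routines are part of the Python scripts linked under Required Software.

# Minimal sketch of the monitoring loop; all helper names and numeric values
# are hypothetical placeholders.
import time

FLUORESCENCE_THRESHOLD = 40.0   # assumed mean-intensity threshold, arbitrary units
CYCLE_PAUSE_S = 600             # assumed waiting time between two rounds

def run_monitoring_loop(robot, camera, pump):
    """Scan once for samples, then repeatedly measure and feed them."""
    sample_positions = robot.scan_for_samples()        # IR backlight + circle detection
    while robot.is_running():
        for pos in sample_positions:
            robot.move_head_to(pos)                     # G-code move via the Arduino
            signal = camera.measure_fluorescence()      # long exposure behind the 515 nm filter
            if signal < FLUORESCENCE_THRESHOLD:
                robot.raise_stage_to_syringe()
                pump.dispense_microliters(50)           # assumed dose of nnAA solution
                robot.lower_stage()
        time.sleep(CYCLE_PAUSE_S)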

ACHIEVEMENTS

  • Successfully redesigned a 3D-printer chassis to meet our requirements
  • Constructed a unique lightbox with integrated IR LEDs for positioning purposes
  • Designed a measuring probe with a camera, an integrated optical filter system and LEDs
  • Implemented an automatic object tracking system including a vector-based feedback system for positioning
  • Constructed a syringe pump system to add liquids with microliter accuracy
  • Connected a Raspberry Pi to an Arduino microcontroller via a serial connection, enabling a variety of different tasks
  • Published the data of all CADs designed by the TU Darmstadt technical department
  • Provided a complete construction tutorial including a BOM (bill of materials incl. prices)

RESULTS
Circle Detection

The sample detection faced two problems which at first glance seem rather simple: detect a circle and detect a rectangle. The detection of a circle is the easier one because a circle is an analytic shape. This is already implemented in OpenCV, and we were able to use this circle detection for our sample tracking system. The green circles are the detected objects. The red lines are vectors from the central position of our probe to the sample positions; they are used to drive the stepper motors so that our probe can move to them. You can also see that changes in the probe's position are recognized and the vectors are recalculated.
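For illustration, such a detection step can be written with OpenCV's Hough circle transform in a few lines. The sketch below assumes a grayscale frame from the Pi camera and OpenCV 3; the threshold and radius parameters are illustrative values, not our tuned settings.

# Sketch of circle-based sample detection on a backlit grayscale frame.
import cv2
import numpy as np

def detect_samples(gray_frame, probe_center):
    """Return the vectors from the probe position to each detected sample centre."""
    blurred = cv2.medianBlur(gray_frame, 5)            # suppress sensor noise
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                               param1=100, param2=30, minRadius=10, maxRadius=60)
    vectors = []
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            vectors.append((x - probe_center[0], y - probe_center[1]))
    return vectors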

Multi-Object Detection

A more delicate task was to detect round samples stored in a rack. The difficulty is that the objects intersect in the image, so the algorithm needs to distinguish them somehow. You can see how the camera sees the whole object; even by eye it is difficult to recognize a rectangle. We improved the algorithm step by step using prominent features of the rack. The result is shown in the last picture.

Rectangle Detection

Rectangle detection is more complicated, but we were able to solve this challenge. You can see how the algorithm detects the rectangles and marks them, while also tracking the individual samples.
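A simple contour-based variant of such a rectangle detection might look like the following sketch; it assumes a backlit grayscale frame, and the area limit is an illustrative value.

# Sketch of rack (rectangle) detection via contours and polygon approximation.
import cv2

def detect_racks(gray_frame):
    """Return 4-point contours that look like rack outlines."""
    _, binary = cv2.threshold(gray_frame, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]   # works across OpenCV versions
    racks = []
    for cnt in contours:
        if cv2.contourArea(cnt) < 5000:                # ignore small clutter
            continue
        approx = cv2.approxPolyDP(cnt, 0.02 * cv2.arcLength(cnt, True), True)
        if len(approx) == 4:                           # four corners -> rectangle candidate
            racks.append(approx)
    return racks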

Full Functionality

As shown in the video, our robot starts its routine and goes from sample to sample, checking for light emission. For demonstration purposes we installed an LED light. One can see that the optics detect the LED light and that the head then moves down to dispense liquid into the sample, because our pipette is installed sideways on the optical head. Afterwards it moves back to its original position and continues its routine. At the end of the video we show possible adjustments such as brightness and focus.

In this video we additionally demonstrate the recognition of rectangles and circles. Using OpenCV and the Raspberry Pi camera we can detect our glowing samples, here simulated with a green LED. After detecting the glow (in the real case the fluorescence of a protein excited by the blue high-power LEDs), we pipet a drop of the respective substance (in our case nnAA solution) into the sample.

FURTHER DEVELOPMENTS

Due to the tight schedule from the start to the end of iGEM, it was not possible for us to realize all ideas and planned developments for improving the robot itself. For working with more kinds of bacterial cultures, it is absolutely indispensable to develop a system that avoids any cross-contamination between the different bacteria. One option would be an extra reservoir filled with ethanol in which the tip of the syringe can be sterilized between the checks of different samples. Another modification that would be useful for working with individual bacterial cultures is making the power LEDs exchangeable. This is necessary if the wavelength of the LEDs does not overlap with the absorption spectrum of the fluorescent proteins, or only overlaps with a part of the spectrum that has a very low absorption efficiency.
Moreover, it may be useful to improve the dispensing system. Instead of a syringe pump, one could use a system with a liquid reservoir and a pump that works continuously, like a turbine; for an example see http://www.ardulink.org/automatic-lipid-dispensing/.
Another useful modification would be to rebuild the robot's foundation, an Ultimaker-style 3D-printer setup, back into a printer. Essential alterations would be to replace the sample stage with a heatbed and to replace the current head with a print-head hotend. Since the current head can simply be clipped off, this would not be too much of a challenge. Furthermore, replacing the syringe extruder would be necessary if the printer is to work with plastics. An alternative approach is a paste 3D printer; in that case it would not even be necessary to change the head and the syringe, because of the already viscous properties of the paste.

Optics

Operating Range of Wavelengths
The robot's optics consist of two big components, a camera head and a lightbox. The camera head is responsible for two tasks: the detection and localization of samples, and the fluorescence measurements. The lightbox illuminates the sample stage uniformly from below, thereby helping the camera to do the detection work reliably. Fluorescence measurement and object detection are separated in terms of their operating range of wavelengths. All the detection takes place at wavelengths above 860 nm, which is near infrared. The lightbox radiates uniform infrared light, and the camera chip is capable of capturing this wavelength. There is nothing special about the camera chip; in principle all commercially available CCD chips can capture near infrared. This is usually an undesirable feature for photographic purposes, since it falsifies image colors, which is why camera lenses are usually equipped with an infrared filter. In our case, we use our own lens system, from which we removed the infrared filter. The reason we chose to do the detection in the infrared is that we are dealing with mVenus, a mutant of eYFP, which does not absorb infrared. Looking at the spectra, one can see that the gap between the absorption and emission peaks is rather small: the absorption peak occurs at a wavelength of 512 nm, while the emission peak is located at 528 nm.

Fluorescence Measurement and Filtering
For pulsed measurements one would need advanced high-frequency circuitry capable of forcing the LEDs to emit pulses of picosecond length; LEDs only reach rise and fall times in the nanosecond range when used in a simple on/off manner. For continuous measurements we instead have to separate the wavelength of excitation from the wavelength of the fluorescence emission. Therefore, we used a longpass filter with a cutoff wavelength of 515 nm. It blocks most of the excitation light while letting most of the mVenus emission pass. The camera captures a long-exposure image which is then analyzed further. To get rid of the residual excitation light appearing in that image, it is digitally color-filtered and segmented into regions of interest (ROIs). The determination of the ROIs occurs simultaneously with the detection of samples. The filtering is very strong and does indeed block some of the already weak mVenus fluorescence, which is why we use an exposure time of 10 seconds per fluorescence capture to make sure enough signal is collected.
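As an illustration of the digital color filtering, the sketch below scores one region of interest of a long-exposure RGB frame; the channel weights and the ROI format are illustrative assumptions, not our calibrated values.

# Sketch of the per-ROI evaluation of a long-exposure frame.
import numpy as np

def fluorescence_score(rgb_frame, roi):
    """Mean 'yellow-green' intensity inside one region of interest."""
    x, y, w, h = roi                                   # ROI from the sample detection step
    patch = rgb_frame[y:y + h, x:x + w].astype(float)
    r, g, b = patch[..., 0], patch[..., 1], patch[..., 2]
    # suppress residual blue excitation light, keep the green/yellow emission
    signal = np.clip(0.5 * r + 1.0 * g - 1.5 * b, 0, None)
    return float(signal.mean())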

Optical Hardware - Camera Head
We are using the 8-megapixel Raspberry Pi camera because we have access to its capture settings, such as framerate, exposure time, gains and light sensitivity, through an existing programming interface [5]. This is absolutely necessary since detection and measurement have totally different requirements. Another benefit is that we can capture directly in grayscale (the Y component of YUV) for fast detection, and switch to RGB when doing fluorescence measurements. We do not have these degrees of freedom with an ordinary USB camera. The disadvantage, however, is that the stock camera comes with a very minimal lens system. Since our measurements take place not only at different wavelengths but also at different distances to the vessels (fluorescence images are taken from each single cuvette while the camera head is placed directly over its opening), the focus needs to be adjustable. We therefore developed a focusing system based on a so-called voice coil, housed in a 3D-printed adapter socket for the Pi camera. The adapter socket also holds the optical longpass filter. The voice coil carries a suspended lens whose distance to the camera chip can be adjusted; this method is used in most smartphones. In our case, we took the voice coil out of an old webcam. The coil is fed with a PWM signal provided directly by the Raspberry Pi's hardware PWM channel. We use a simple L298N H-bridge motor driver to amplify the PWM signal and to decouple the Raspberry Pi's precious hardware PWM pin. Different duty cycles correspond to different focal positions, and the coil current is tuned with a potentiometer. In this way we are able to focus the lens automatically by evaluating simple Sobel-filter-based sharpness measurements. Our autofocus finds the best sharpness within two seconds, using a robust global search algorithm. It is applied every time a new set of racks and samples is placed into the robot, i.e. prior to each new session. To adjust the focus for individual fluorescence captures, the sharpness of the individual sample corners is also considered. The camera head furthermore includes the excitation LEDs: four high-power Cree XT-E LEDs driven by a 0.9 A current source and a PWM signal delivered by the Raspberry Pi.
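The sharpness criterion and the global search can be sketched as follows; set_focus_pwm and capture_gray are placeholders for the actual hardware PWM and camera calls, and the duty-cycle step width is an arbitrary choice.

# Sketch of a Sobel-based sharpness measure and a simple global autofocus search.
import cv2
import numpy as np

def sharpness(gray_frame):
    """Mean gradient magnitude of the image; higher means sharper."""
    gx = cv2.Sobel(gray_frame, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray_frame, cv2.CV_64F, 0, 1, ksize=3)
    return float(np.mean(np.hypot(gx, gy)))

def autofocus(capture_gray, set_focus_pwm, duty_cycles=range(0, 101, 5)):
    """Global search: try each duty cycle, keep the sharpest one."""
    best_duty, best_score = None, -1.0
    for duty in duty_cycles:
        set_focus_pwm(duty)                            # move the voice-coil lens
        score = sharpness(capture_gray())
        if score > best_score:
            best_duty, best_score = duty, score
    set_focus_pwm(best_duty)
    return best_duty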

We assembled the self-made autofocus in a cleanroom at the Institut für Druckmaschinen und Druckverfahren of TU Darmstadt, in order to keep the lens free of dust. We are thankful for the opportunity to work under these conditions, but a cleanroom is not mandatory for the autofocus; just make sure to work in a clean, dust-free environment.
The lightbox is an essential part of the detection. All applied detection algorithms rely on thresholding the image, or filtered versions of it. Thresholding essentially makes a binary selection of relevant versus irrelevant image information, so some image information is always lost. If there is less clutter in the image, there is no need to use strong thresholds, and more of the image information is preserved. The lightbox acts as a clean background, creating only small amounts of static noise and clutter thanks to its uniform radiation of light, which allows us to use less strict thresholds. It also emphasizes the samples' corners and enhances the detection reliability. The heart of the lightbox is a Plexiglas panel called "Endlighten": light is injected laterally and reflected off systematic impurities inside the panel [6], so that it leaves the panel uniformly in all directions. Light leaving the back side of the panel is mirrored to the front side by a white reflective Plexiglas sheet, and additionally diffused by a diffuser plate, also made of Plexiglas. The light is injected by flat-end infrared LEDs which are mounted on 3D-printed rails and clamped tightly to the sides of the Endlighten panel. The LEDs are driven by constant current sources to give them a long lifetime.

Cooling
All electronic components produce a significant amount of heat, especially the motors, the power supply and the Raspberry Pi. Since the robot chassis is meant to be completely enclosed to keep light out, heat would pile up in the upper part of the interior. To protect the samples from temperatures above room temperature, a cooling system is necessary that ensures proper air circulation without letting in ambient light. To fulfill these two requirements we adapted a double-walled cooling concept. Its simplest implementation exploits the fact that warm air rises naturally and uses the power supply as the air intake: the power supply is placed at the bottom of the robot and draws in fresh air. The air, warmed up by the interior, rises to the top and is expelled by a radial fan through an extractor hood made of a laser-cut MDF grid. The warm air between the MDF grid and the outer wall is directed through a 3D-printed exhaust tunnel; thus, ambient light is kept out.
Software
Marlin
Marlin is an open-source C++ firmware for 3D printers, available on GitHub. It runs on an 8-bit microcontroller, in our case an Arduino Mega 2560. Because Marlin is adaptable to many boards and different configurations, it was the perfect choice for our project. Nevertheless, we had to modify the firmware a little so that it meets all our needs. For example, we had to configure so-called endstops, which ensure that the sample stage and the head do not crash at the end of an axis. If you are interested in all the changes we made, click here.
There, you will also find a link where you can download the original, unmodified version of the Marlin firmware.
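As an illustration of the serial link between the Raspberry Pi and the Arduino running Marlin, the following sketch sends a few G-code commands with pyserial; the port name, baud rate and coordinates are assumptions and depend on the actual configuration.

# Sketch of driving the Marlin firmware over the USB serial connection.
import serial
import time

printer = serial.Serial("/dev/ttyACM0", 115200, timeout=2)  # assumed port and baud rate
time.sleep(2)                        # give Marlin time to reboot after connecting
for gcode in ("G28 X Y",             # home the X and Y axes against the endstops
              "G1 X80 Y120 F3000",   # move the head over a detected sample (example coordinates)
              "G1 Z30 F600"):        # raise the stage towards the syringe
    printer.write((gcode + "\n").encode("ascii"))
    printer.readline()               # wait for Marlin's "ok" acknowledgement
printer.close()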
OpenCV

OpenCV is a cross-platform image processing library, free for use under the open-source BSD license. Its development was initiated by Intel in the nineties to demonstrate the capability of CPUs in executing complex image processing tasks. OpenCV covers everything from the most basic morphological image operations up to advanced machine learning algorithms. It is written in C++, which is also its primary interface; by now most of the available features have been wrapped for other programming languages such as Python, which we use. What we are especially looking for in OpenCV are its feature extraction algorithms, like the Hough transformation, contour and edge detection, and image moment extraction. We also make our lives easier by utilizing its implemented and optimized morphological operators and image filters, like the median filter and the Sobel operator.
To enhance the long-term reliability of detection we separate the detection procedure into two steps. Since racks are predominantly used to hold the sample vessels, we first extract and detect the rectangular racks with contour and thresholding techniques; several image shape descriptors are used to track possible positional changes. The user is then allowed to manually assign regional attributes (e.g. radii, heights) for each rack. The assignment of local radii helps the circle detection in the second step: not only does it suppress false detections, it also allows for a proper multithreaded apportionment of the work.
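The per-rack apportionment could be sketched as follows: the circle search runs independently on each rack's bounding box, with the user-assigned local radius narrowing the search. The thread count and all parameter values are illustrative assumptions, not our production settings.

# Sketch of a multithreaded, per-rack circle search.
from concurrent.futures import ThreadPoolExecutor
import cv2

def circles_per_rack(gray_frame, racks, radius_hint=25):
    """Search for sample circles inside each rack's bounding box in parallel."""
    def search(rack_contour):
        x, y, w, h = cv2.boundingRect(rack_contour)
        roi = cv2.medianBlur(gray_frame[y:y + h, x:x + w], 5)
        found = cv2.HoughCircles(roi, cv2.HOUGH_GRADIENT, dp=1.2,
                                 minDist=2 * radius_hint, param1=100, param2=30,
                                 minRadius=radius_hint - 5, maxRadius=radius_hint + 5)
        if found is None:
            return []
        # shift the detections back into full-image coordinates
        return [(int(cx) + x, int(cy) + y, int(r)) for cx, cy, r in found[0]]

    with ThreadPoolExecutor(max_workers=4) as pool:
        return [c for rack in pool.map(search, racks) for c in rack]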

PyQt

Qt is a software framework for developing GUIs (graphical user interfaces) [7]. It is available under a commercial and an open-source license. Qt is a cross-platform application framework, which means it runs on most operating systems, such as Unix-like systems or Windows. The underlying programming language is C++, and Qt can also be used together with existing languages like JavaScript, making it a powerful tool.
The main idea of Qt is a system of signals and slots, which provides an easy framework for connecting displayed elements with underlying functions and enhances the reusability of existing code. Every graphical element, for example a button, emits its own signal when it is pressed or used. The signal can then be used to trigger an action, like closing a window. If the signal is not connected to a function, nothing happens; the signal is simply emitted with no consequences. By connecting the emitted signal with a desired action, called a slot, the program gets its specific behavior.
Qt is widely used by companies such as the European Space Agency (ESA), Samsung, DreamWorks, Volvo and many more. To combine the possibilities of Qt with the simplicity of the Python programming language, PyQt was developed. PyQt is a Python binding that exposes Qt's classes and methods within the Python syntax.
To get a direct preview of the constructed GUI, Qt Designer is a helpful tool. It enables an intuitive way to build a graphical user interface without explicitly coding it. To work with the code itself afterwards, PyQt provides a command-line tool called pyuic followed by the version number (e.g. pyuic5), which converts the Designer file into Python code.
After the conversion, one can open the GUI as a regular Python script and work with it as usual.
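The following minimal PyQt5 example illustrates the signal/slot mechanism described above; it is not our GUI code, just a sketch of how a run button could be connected to a slot.

# Minimal PyQt5 signal/slot demonstration.
import sys
from PyQt5.QtWidgets import QApplication, QPushButton, QVBoxLayout, QWidget

app = QApplication(sys.argv)

window = QWidget()
window.setWindowTitle("Signal/slot demo")

def on_run_clicked():
    # slot: in the real GUI this would start the robot's monitoring routine
    print("Run button pressed")

run_button = QPushButton("Run")
run_button.clicked.connect(on_run_clicked)   # connect the signal to the slot

layout = QVBoxLayout(window)
layout.addWidget(run_button)
window.show()

sys.exit(app.exec_())

A GUI designed in Qt Designer would be converted beforehand with a call such as "pyuic5 mainwindow.ui -o mainwindow.py" (file names chosen for illustration).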

BUILDING INSTRUCTIONS
The following video shows the assembly of our robot's mechanics. Feel free to jump around in the video or download it, so you can watch it as many times as you like. All necessary steps are commented.
Construction Video

Circuit Diagram

The stepper motors for X and Y are plugged into the RAMPS board as shown in the graphic; ready-made cables with the corresponding connectors are available. To control and power them, the stepper drivers (A4988) are plugged into the RAMPS slots marked X and Y. The bottom of each stepper driver is labelled to show which pin is ground and so on, and the RAMPS board likewise marks where the corresponding ground and supply pins are. The X and Y stepper motors each have one endstop to provide a stop signal for the homing routine. For safety reasons it is also possible to add one more endstop per axis, so that the movement reliably stops at the end positions. All endstops are plugged into the marked positions on the RAMPS board; from left to right the pins are for X min and X max, Y min and Y max, and Z min and Z max. The pin marked with a + carries 5 V, the middle pin is ground and the pin marked with an S is the signal. The Z stepper motor has two endstops and is connected like the X and Y stepper motors and endstops. To drive our syringe pump, one more stepper motor and one more endstop are required; they are plugged into the E0 position and connected like X, Y and Z.
The RAMPS board needs 12 V, for example from an ATX power supply. The Raspberry Pi 3 is responsible for controlling the optics and LEDs, whereas the Arduino controls the stepper motors and endstops. Arduino and Raspberry Pi are connected via one of the Pi's USB ports and the USB type B input of the Arduino. The lightbox is powered with 24 V from the ATX power supply: it consists of 4 LED bars, each carrying 10 infrared LEDs in series, and each bar is driven by a 50 mA constant current source. These current sources are connected in parallel between the -12 V rail of the power supply and the NC contact of one relay; the COM contact of that relay is connected to the +12 V rail. The two-relay module serves as a Raspberry Pi controlled switch for the lightbox and the blue excitation LEDs. The wire numbers in the diagram refer to the GPIO pins used. Be careful: the internal GPIO numbering differs from the numbering on the Pi layout scheme; all numbers here correspond to the latter!
The L298N boards are DC motor drivers which we repurposed as a protection circuit, to separate the 12 V from the Raspberry Pi's GPIO. The blue excitation LEDs are connected in series and are controlled by the first channel of the two-relay module.
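To illustrate how the Raspberry Pi switches the relay module, here is a short sketch using the RPi.GPIO library; the pin numbers are placeholders for whatever GPIO pins are wired in the circuit diagram, and the relay channels are assumed to be active-low, as is common for such modules.

# Sketch of switching the IR lightbox and the blue excitation LEDs via the relay module.
import RPi.GPIO as GPIO
import time

LIGHTBOX_PIN = 11        # relay channel for the IR lightbox (placeholder pin)
BLUE_LED_PIN = 13        # relay channel for the blue excitation LEDs (placeholder pin)

GPIO.setmode(GPIO.BOARD)                 # physical pin numbering, matching the Pi layout scheme
GPIO.setup([LIGHTBOX_PIN, BLUE_LED_PIN], GPIO.OUT, initial=GPIO.HIGH)  # both relays off (active low)

try:
    GPIO.output(LIGHTBOX_PIN, GPIO.LOW)  # IR backlight on for sample detection
    time.sleep(5)
    GPIO.output(LIGHTBOX_PIN, GPIO.HIGH)
    GPIO.output(BLUE_LED_PIN, GPIO.LOW)  # blue LEDs on for a fluorescence capture
    time.sleep(10)                       # matches the 10 s exposure mentioned above
    GPIO.output(BLUE_LED_PIN, GPIO.HIGH)
finally:
    GPIO.cleanup()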

Required Software

In case you want to rebuild our project, you can download all the required software in this section. The software includes the exact Marlin firmware we uploaded to the Arduino, as well as the software installed on the Raspberry Pi and some improved Python scripts. Furthermore, to enable wireless support, install and configure "hostapd" and "dnsmasq" on your Raspberry Pi; we followed this external tutorial.

Bill Of Materials
The costs on this bill are just an example of what we paid for the items; the prices may differ from ours if you buy them from another shop. It should also be mentioned that many of the items used in this project were 3D-printed or manufactured in a workshop, so we did not have to pay for them, and the total cost of the project may therefore turn out higher. Nevertheless, if you are still interested in constructing this robot, you can take a look at the bill of materials here.