Using an Artificial Fingerprint to Improve Tactile Sensitivity in Hand Prostheses and Robotic Systems

By Madeline Dubelier, THINK 2014 Winner

Abstract

The current technology for tactile sensitivity in robotic and prosthetic hands is developing but not widely accessible. The human fingerprint not only makes each person unique but also enables us to feel all types of surfaces and discriminate between different objects. The hundreds of ridges on our fingertips increase the surface area available to the thousands of nerve endings they contain. In this project, I built a robotic and prosthetic hand and integrated the unique design of a human fingerprint into it to increase its tactile capability. The artificial fingerprint consists of a pad-like attachment, connected to the robotic finger, with small flaps or raised portions embedded with force-sensitive resistors (FSRs). When the fingertip is run across a surface, the robotic finger uses the amounts of pressure detected by the different sensors to determine whether the surface is rough or smooth. This design could ultimately be implemented in robotic systems to increase the sensory capabilities of autonomous robots, or in prosthetic hands to help amputees regain their lost sense of touch.

Goals

  • Original
    • ✓ Develop a design for the artificial fingerprint that is best suited for texture discrimination
    • ✓ Build the artificial fingerprint and test its ability to differentiate between rough and smooth surfaces
  • Current
    • Standardize test protocols
    • Determine which rough vs. smooth criterion has the highest accuracy and program it into the Arduino

Tasks

  • Redesign the artificial fingerprint
  • Construct the new artificial fingerprint
  • Write a program to discriminate between rough and smooth surfaces using the difference in sensor readings
  • Test and modify the program
  • Write a revised program to discriminate between rough and smooth surfaces using the standard deviations in sensor readings
  • Iterate on testing and modifying the program

Progress

At present, I have built the finger and am working to establish an accurate rough vs. smooth criterion. As I began to build the artificial fingerprint, I quickly learned that the construction methods I originally proposed were not realistic. Instead, I created a slightly different design that incorporated features from both designs in my proposal.

The finger structure was made from a 0.5-inch dowel. Using a saw, I cut out a section of the dowel for the fingertip. I then made a clay mold in the shape of the dowel cutout and pressed two glass-bead imprints into it. First, I placed two glass beads in the mold using the imprints and filled the mold with silicone caulk (the same kind used for plumbing). When the tops of the beads were no longer visible, I layered the two FSRs so that each was directly above one of the beads, and then filled the rest of the mold. The caulk took about 24 hours to cure. Once the fingertip was removed, I soldered wires to the ends of the sensors and plugged them into the breadboard. At first I used hookup wire, but it was too stiff and kept breaking the FSRs. To fix this, I switched to thinner, more flexible stranded wire. Because the connection was still not perfect, I reinforced it with electrical tape. The fragility of this design is something I would like to improve in the future.

Programming

My decision to use Arduino to program my prototype was slightly ambitious, considering that I had little programming experience. Thankfully, with the help of my teacher and online tutorials, I was able to learn to program, and I am very pleased with my increased proficiency on the Arduino platform as a result of this project. Originally, to discriminate between rough and smooth surfaces, I decided that examining the difference between the two FSR values would be adequate. The logic is that if the surface is smooth, there should be no difference between the two sensors' values, whereas a rough surface should apply different amounts of pressure to the two sensors and therefore produce a difference greater than zero. Between narrow and thick grating, the intuition is that the narrow grating should produce more fluctuation than the thick grating and therefore read as rougher. This program worked, but it was not consistent enough because it only looked at the difference at a single instant instead of over a period of time.

My next idea was to look at the standard deviation. The rationale behind this criterion was that a high standard deviation indicates the wide range of sensor values a rough surface should produce. I created a program that calculated and output the standard deviation. Next, I ran a series of tests over rough and smooth surfaces to determine the standard-deviation threshold that defines the boundary between rough and smooth. However, when I went back and examined the data, I was not able to establish a reliable threshold.

Currently, I am working to develop the criterion by looking at the distributions in the histograms of sensor values. I ran some preliminary tests (not yet fully standardized) to get an idea of what these distributions would look like for the different surfaces (data in the next section): smooth, rough (thick grating), and rougher (narrow grating). My intuition is that a flat, wide curve corresponds to a rough surface, while a tall, skinny curve corresponds to a smooth surface. While I have not finished programming this, I believe it is the most promising route and will continue to pursue it.

Preliminary Data

The x-axis of each graph shows the range of sensor values. The sensor values come from voltage inputs to the Arduino between 0 and 5 V, which are converted into raw numbers between 0 and 1023, where lower numbers correspond to more pressure. We did not see sensor values below 800 because the amount of pressure needed to produce them is very large, so we divided the attainable range (800 to 1023) into 12 buckets on the x-axis to plot the histograms.

Smooth Table
                     Sensor 1   Sensor 2
Minimum                   963        884
First Quartile            975        906
Median                    985        928
Third Quartile            996        986
Maximum                  1018       1023
Mean                      987        946
Standard Deviation         14         47

Thick Grating
                     Sensor 1   Sensor 2
Minimum                   910        879
First Quartile            965        988
Median                   1023       1011
Third Quartile           1023       1023
Maximum                  1023       1023
Mean                      999        997
Standard Deviation         35         35

Narrow Grating
                     Sensor 1   Sensor 2
Minimum                   989        988
First Quartile            999       1009
Median                   1009       1017
Third Quartile           1018       1018
Maximum                  1023       1023
Mean                     1008       1014
Standard Deviation         11          8

Lessons Learned

This project has taught me a lot, not only about programming and building but also about the design process in general and how to deal with obstacles along the way. At the beginning of the programming stage, I was constantly going to my teacher to ask how to do certain things. By the end of the project, however, I could proficiently write simple Arduino code, which is a big step and something I feel really proud of. In addition, I learned how to use more of the materials and equipment in our lab: I became better at soldering intricate electronics and learned how to use the Arduino and its different sensors. Most importantly, I encountered more obstacles than I had originally expected. Since I compete in robotics, I have faced many building and programming obstacles as part of a team; because this was an independent project, however, I learned the importance of asking for help and bouncing ideas off other people. Overall, this was a great opportunity to learn more about a subject that I enjoy and about the engineering process.

Future Work

This THINK project will be the start of a 1-year school project that runs through spring 2015. My plan for the next year is to create a glove for prosthetic devices that is equipped with a variety of sensors. Through sensory substitution, an amputee would wear an armband on their residual limb that relays the sensory feedback gathered by the glove. I plan to use temperature sensors, Peltier chips, vibration motors, and more FSRs to recreate the senses of texture (as a continuation of this project), temperature, and pressure. I would also like to 3D print a robotic hand (similar to the InMoov robot hand) and use it with the glove. There are many possible directions for this project, and the impactful applications really excite me.