I work on the interface between people and technology. By developing new ways of interacting with our environment through intelligent systems, I aim to bring the two closer together. As technologies around us become more complex, efficient communication between the user and the product is becoming ever more important. I strive to develop products that are technically challenging, but feel natural to operate. If you are looking for a freelancer in the field of industrial design, technology development or robotic applications, please feel free to contact me.
Paolo Falco Rüegg
55 Sheffield Terrace
London W8 7NB
I met Seyi (sosafresh.com) at the Queen Elizabeth Prize for Engineering awards, for which Gravity Sketch had been nominated. One thing led to another and I joined the startup for a month before my next year at university. Gravity Sketch is the first mixed-reality 3D creation tool designed from the ground up for touch and gesture input. They are striving to make digital 3D creation easier for designers, artists, and makers alike. I worked mainly on content creation for their upcoming Kickstarter campaign, creating animations, clips and (video) renders. Furthermore, I worked with and on their new VR version, which allows for an extremely immersive 3D creation experience.
Below, you can get an impression of what I was doing and working on. Part of the internship also included making this website a bit tidier and better looking.
19% of the UK population have some form of limitation or impairment, and a fifth of these individuals have trouble accessing current public transport. With that in mind, we designed the interior of a novel transport solution for the physically impaired called AVII (Autonomous Vehicle for Increased Independence). The design specifically caters for wheelchair users, people with auditory and visual impairments, as well as the elderly population. At the heart of AVII is its narrow and compact form, allowing it to park perpendicularly. Thanks to its dual-door design, (wheelchair) users are able to board from the back and exit to the front of the vehicle, without having to perform tricky turns. We started from scratch and did lots of background research, evaluated user needs and translated these into product specifications. Initial concepts were then combined and crystallised into the final design (as shown in the title image). The six members of our team then constructed a full-size buck according to a complete CAD model. Please note that this was a group project and not my sole work.
Sensorium is a cross-disciplinary project bridging art and science that creates collaborations across a variety of different faculties of Imperial College London. It is an immersive environment exploring the complexity of scientific pursuit that students of this institution face during their studies. The journey of an individual is conveyed in a mysterious and unfamiliar setting, culminating in a spectacular interactive piece. Exploring the path through shadows of the unknown to the final reveal, the visitor gains a unique insight into the combination of engineering, science and design.
I was thoroughly involved in the ideation phase and later on as a tech engineer, working on interactive LED panels. The project was accomplished over six weeks and culminated in the two-day exhibition for Imperial Festival. We had a lot of positive responses and were given a special award for our commitment. Below, you can see some impressions of the exhibition, as well as the making-of video.
Self-built guitar pedal inspired by Make’s optical tremolo. The disk in the middle of the picture has a black and a transparent half, with a motor underneath spinning it. Beneath the switch, a photoresistor is mounted onto a Japanese coin (they have a hole in the middle, so that came in handy). When light shines onto the photoresistor, the guitar signal is inhibited; when the resistor lies in shadow, the signal passes through normally. Because the disk spins, it casts a rotating shadow onto the wooden board and onto the photoresistor, and this shadow play produces a tremolo effect. It works best in shady bars, as daylight washes out the contrast between light and dark. Below you can see and hear a demo of the pedal.
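In essence, the spinning shadow acts as a low-frequency on/off gain applied to the guitar signal. A minimal Python sketch of that idea (the rate, depth and sample values are illustrative, not measurements from the pedal):

```python
import math

def tremolo(signal, rate_hz, sample_rate=44100, depth=1.0):
    """Modulate a signal with a square-wave gain, mimicking the
    half-black disk's shadow sweeping over the photoresistor."""
    out = []
    for n, x in enumerate(signal):
        t = n / sample_rate
        # lit half of the disk lets light through -> signal inhibited
        # is modelled here simply as full gain vs. reduced gain
        lit = math.sin(2 * math.pi * rate_hz * t) >= 0
        gain = 1.0 if lit else 1.0 - depth
        out.append(x * gain)
    return out
```

With `depth=1.0` the signal is fully chopped, which is the hard light/dark contrast of a dim room; daylight would correspond to a smaller effective depth.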
One switch, five LEDs and a bit of textual support. Finding out how drunk you are has never been easier. An MQ-2 gas sensor measures the alcohol in your breath. The signal is passed on to an Arduino UNO, which does some calculations with it and turns on the respective LEDs. When the blue one lights up, you smell like this fine bottle of rum.
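The mapping from breath reading to LEDs is a simple thresholding. The device itself runs on an Arduino, but the logic can be sketched in Python; the threshold values below are illustrative placeholders, not the calibrated ones:

```python
def leds_to_light(reading, thresholds=(150, 250, 350, 450, 550)):
    """Return how many of the five LEDs to switch on for a raw
    MQ-2 analogue reading (0-1023). Thresholds are illustrative,
    not calibrated against actual blood alcohol levels."""
    return sum(reading >= t for t in thresholds)
```

All five LEDs on (the blue one included) corresponds to a reading past the highest threshold.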
The supermarket food database is a Python-based software solution for shops and supermarkets with perishable goods. It facilitates bookkeeping of all the foods on sale at a given moment. Specifically, the software is designed to show expired and expiring goods so that they can be reduced in price or replaced. The user interface is powered by Qt and linked to the Python database via PyQt4. The entries are shown in both a list and on a dynamic map. This project was created for a computing module about linear data structures, algorithms and object-oriented programming. The database is built on fundamental data structures such as binary search trees. The source code and user manual are available here.
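A binary search tree keyed by expiry date makes the expiring-goods view natural: an in-order traversal yields items from soonest- to latest-expiring. A stripped-down sketch of that idea (not the project's actual classes):

```python
class Node:
    def __init__(self, expiry, name):
        self.expiry, self.name = expiry, name
        self.left = self.right = None

class ExpiryTree:
    """Minimal BST keyed by expiry date; an in-order walk lists
    goods from soonest- to latest-expiring."""
    def __init__(self):
        self.root = None

    def insert(self, expiry, name):
        def _ins(node):
            if node is None:
                return Node(expiry, name)
            if expiry < node.expiry:
                node.left = _ins(node.left)
            else:
                node.right = _ins(node.right)
            return node
        self.root = _ins(self.root)

    def in_order(self):
        def _walk(node):
            if node:
                yield from _walk(node.left)
                yield (node.expiry, node.name)
                yield from _walk(node.right)
        return list(_walk(self.root))
```

ISO-format date strings sort correctly as plain strings, so they can serve directly as keys.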
As part of an assignment, we were asked to combine a random brand and product. I was given the brand Hendrick’s Gin and had to design a grass trimmer. This is the result.
The example shows how even the most peculiar products can be made credible by integrating them into a brand image. Likewise, combining unusual brands and products can lead to interesting ideas. Please note that this is a mock-up and does not in any way represent views or interests of Hendrick’s Gin.
The aim of this project was to create an electromechanical machine that generates sound by integrating machine elements, sound design and technology. My group decided that our machine had to create harmonies, look contained, be sleek and provide nice visuals. The project evolved from these initial requirements to a fully formed idea of two instruments being played simultaneously. In the final design, chimes provide a melodic sound, while the cymbal generates a continuous background noise. The whole machine is controlled by a Raspberry Pi and can be played through a user interface. Please download the project report here if you wish to know more about Chymbal. I compiled a few interesting photos and videos to give you an impression of the build. Please note that this was a group project and not my sole work.
I collaborated with the University of Zürich and conducted a study about music. More specifically, the study deals with how our brain behaves when we improvise on an instrument. I used electroencephalography to measure brain waves whilst playing note-for-note and whilst improvising. Analysis of those signals showed a massive increase in the theta frequency range when improvising. The inverse solution of these signals pointed to increased activity in the anterior cingulate cortex. The whole paper can be downloaded here. The paper received awards both within the school and from the external organisation Impuls Mittelschule. I was able to present my findings to Zürich's commission on education and to the city board. Furthermore, the paper was displayed at the town hall and at ETH Zürich.
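The core of the analysis is comparing spectral power in frequency bands (theta is roughly 4-8 Hz) between conditions. A crude, pure-Python sketch of band power via the discrete Fourier transform; the real study used proper EEG tooling, and the numbers here are illustrative:

```python
import math

def band_power(samples, fs, lo, hi):
    """Sum of DFT power in the frequency band [lo, hi] Hz.
    O(n^2) and illustrative only -- real EEG pipelines use FFTs,
    windowing and artefact rejection."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f <= hi:
            re = sum(x * math.cos(2 * math.pi * k * i / n)
                     for i, x in enumerate(samples))
            im = -sum(x * math.sin(2 * math.pi * k * i / n)
                      for i, x in enumerate(samples))
            power += (re * re + im * im) / n
    return power
```

Feeding in a 6 Hz oscillation, for instance, yields power concentrated in the theta band and almost none in the alpha band (8-13 Hz).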
This is my improved design of a Mini Wheg™. Whegs are a combination of a wheel and a leg and are used in robot locomotion. The main advantage is that the robot is able to climb obstacles greater than the radius of the wheel (which usually limits how high it can climb). This robot measures only 20 cm in length and 8 cm in width and is built mostly from stock parts, while the remaining components can be 3D-printed. I performed a full engineering analysis to determine the RPM and torque the motor requires to achieve sufficient locomotion. From there, I designed a spur-gear/planetary gearbox as well as a belt drive system. Furthermore, I proposed a novel grab-and-drag mechanism towards the rear of the robot, which allows it to transport other objects. Applications are wide-ranging, such as in cleanups after natural catastrophes. The model was made in SolidWorks. Below are some more pictures.
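The back-of-the-envelope version of that motor sizing: worst-case climbing torque with the full weight acting at the wheg-tip radius, and RPM from the target ground speed. The mass, radius and speed below are placeholder values, not the ones from the actual analysis:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def motor_requirements(mass_kg, leg_radius_m, speed_m_s):
    """Worst-case stall torque (full robot weight on one wheg tip)
    and the shaft speed needed for a target ground speed.
    Illustrative sizing only."""
    torque_nm = mass_kg * G * leg_radius_m      # tau = m * g * r
    omega = speed_m_s / leg_radius_m            # rad/s, v = omega * r
    rpm = omega * 60 / (2 * math.pi)
    return torque_nm, rpm
```

A gearbox then trades the motor's high RPM for the torque this estimate demands, which is where the spur-gear/planetary stage comes in.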
My friend Sebastian Suter and I bought this beautiful old bike out of a barn somewhere in the Swiss countryside for about £300. Over the course of almost two years we stripped the bike down to the frame, replaced what needed replacement and added lots of love. We gave the frame, tank and a lot of other parts a new powder coat. This project took us ages, simply because neither of us had any clue about motorcycles. In the end it turned out that reboring the engine would probably have sufficed to get it running. We realised that only after we had taken the whole engine apart. Special thanks go out to almost every motorcycle mechanic in and around Zürich who helped us get this beast to run. It passed MFK (MOT) and was driven around for about a year. Now a seal in the crankshaft, which is next to impossible to access, is broken. We're thinking of taking the infamous two-stroke engine out altogether and replacing it with a slightly more modern one. Updates will follow.
The razor market is currently oversaturated with short-lived and overengineered products. The brief for this project was to design a sustainable razor within a circular product service system.
Air is a minimal and refillable razor that uses the least amount of material possible. It has an innovative cartridge change mechanism, which is engaged when the two sides are pressed together. The hinges separate and a new cartridge can be inserted, as shown in the picture below.
Other features, including the streamlined and ergonomic shape, are presented in the annotated concept sketch.
The material choice was brass, as it has a high perceived value and fulfilled the various technical requirements. With regards to sustainability, it is widely recycled and could be sourced from industry offcuts. A Life Cycle Assessment was produced using CES and is shown below.
Finally, the product was also integrated into a proposed service system. In order to achieve circularity, users would collect and eventually give the used cartridges back for recycling.
Design a portable, low cost device for measuring tidal lung volume and residual lung volume that is non-invasive and microbiologically safe.
Chronic lung disease is an increasingly common cause of disability and death, especially in low-income settings where the appropriate tools to diagnose and treat it aren’t available. A spirometer is a device that measures lung volume by measuring the amount of air being inhaled and exhaled. It can be used to see whether the lungs are obstructed or narrowed. Spirometers are not currently available in low-income settings because of the expensive equipment involved. Our group therefore decided to build a low-cost spirometer with a separate add-on that enables measurement of total lung capacity.
We performed user analysis, including background research, stakeholder analysis and customer journey mapping, to establish the context of use. After having defined core product requirements, we translated these into a product specification. Subsequently we started generating concepts, as illustrated in the following diagram.
Three concepts to measure air flow were tested using low-fidelity Arduino prototypes. Hot-wire anemometry, which relates the cooling of a wire in an air stream to its flow rate, was chosen. This sensor was low in cost and fulfilled the hygienic requirements for a medical device. The air channel shown below was designed to measure accurately and achieve flow uniformity.
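Hot-wire anemometry is commonly modelled with King's law, E² = A + B·vⁿ, relating bridge voltage E to flow velocity v; inverting it gives velocity from a voltage reading. A small sketch of that inversion, where the coefficients A, B and n are placeholder values (in practice they come from calibrating the sensor):

```python
def velocity_from_voltage(E, A=1.2, B=0.8, n=0.5):
    """Invert King's law, E^2 = A + B * v**n, to estimate flow
    velocity from the anemometer bridge voltage E.
    A, B and n are illustrative, not calibrated values."""
    return ((E * E - A) / B) ** (1.0 / n)
```

Integrating the estimated velocity over time and the channel cross-section then yields the exhaled volume that the spirometer reports.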
With regards to ergonomics and form, we designed a main enclosure around the air channel. A modular system allows the user to slide in a disposable mouthpiece. On the other side, the add-on for the extended measurements could be slid in.
Through blue foam modelling, the team determined the optimal grip and position of the handle. The final CAD of the enclosure is shown below.
We then started to integrate the technology into a contained device. Unfortunately, it was not possible to accommodate all components in the above geometry. Because the top priority was to create a contained and functioning device, however, the team decided to enlarge the enclosure for the purposes of the technology mock-up. It used a Raspberry Pi that took readings from the sensor. An exploded view of the workalike model is shown below.
The final step was to create the user interface. We did not use any displays to keep the cost down. Instead, all the data was sent from the Raspberry Pi to a smart device via VNC. It was assumed that the operating body or the technicians implementing Breeze would have access to smart devices. I programmed a GUI that runs on laptops, tablets and desktop computers. The layout is shown below. Finally, here is a video of the tech mock-up in action.
Continuum is an interdisciplinary installation that celebrates creativity and idea generation. Every visitor receives a ping pong ball which they customise. The ball is then inserted into the Rube Goldberg-style machine, where it undergoes an exciting journey. The visitor accompanies their ball through the machine, interacts with it, and sees how it changes. The whole machine stands as a metaphor for creativity, with the balls representing different ideas. Some shine bright and make it, and others are lost in the chaos.
This was in some ways the sequel to Sensorium, and I was more involved on the conceptual side and in the final execution. We had incredible feedback, and a few weeks later we were asked to present the machine at the Victoria and Albert Museum! The making-of and V&A videos are shown below:
I supported Produkt Design Zürich over my summer break for six weeks. It was a lot of fun and taught me a lot about design consultancy work. There were always multiple projects on at one time, and it was key to be able to quickly switch between different jobs and feed in knowledge from different threads. Because none of the projects I worked on have launched yet, I unfortunately can't go into too much detail about the actual products. I worked in the home security, medical and industrial testing fields. My responsibilities were very varied, from meeting customers to building functional prototypes with low-fidelity electronic platforms. I spent a lot of time doing CAD, designing enclosures, and getting them ready for rapid prototyping. The enclosures needed to balance many requirements such as ergonomics, ingress and impact protection. At the same time I also did conceptual work, ideating viable products for a well-known Swiss kitchenware brand. Check out some of their work here.
Robots have the potential to solve big problems. They already assemble cars and explore planets far away, but they are usually kept away from us. What keeps robots from helping you clean your dishes or carry the moving boxes? It is the fact that they do not fully understand their environments. Usually dependent on a single vision system, they do not know where you are, which inhibits intuitive interaction and makes them potentially harmful.
SIO is a skin that gives robots a sense of what is around them as well as a sense of touch, thus making them collaborative. If it senses that a human is in its way, it knows to stop. If a box is slipping out of its gripper, it knows to use more force. Our robot skin works like the compound eyes of a bee, seeing not only through a pair of eyes, but through many more. Because the system can be integrated with the robot’s motion planning system, SIO eliminates the need for particular skills like programming to operate the robot. Some of the most interesting user interactions that become possible are shown in the video above.
The two main technical features, proximity and touch sensation, have been prototyped and taken through several stages of technical development, showing clear proof of concept. Through innovative material science, SIO not only gives the robot perception, but also protects it and its surroundings. Equipped with LIDAR sensors, it has 360 degrees of vision, adding the level of safety needed for intuitive interaction with humans. Proximity and touch detection were prototyped in two stand-alone demonstrators, shown below.
You can check out the project blog here. This project followed a very rigorous design process: analysing the current market, gathering industry feedback, proposing a product service system and constantly feeding insights back into the development. It is presented, along with more details on the technical development, in the project report that you can access here. The team would like to thank the Imperial College Hackspace for their financial support throughout the development of this project. The ICAH were kind enough to support SIO's early development with a grant along with professional guidance. We would also like to thank the following people and institutions for their invaluable support in providing feedback on our project.
The goal of this project was to create a fully automated chess-playing robot using a FRANKA robotic arm. The project was written in Python, while ROS was used to interface with the robot. This was a team project in which I worked on the Perception module, which was concerned with understanding what moves were made by the person and the robot. We used an RGB-D camera and open-source machine vision libraries to build the module. Below follows a demonstration of the chess robot, where I am playing a game that I am doomed to lose.
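At its core, this kind of move detection boils down to comparing board occupancy before and after a move: the square that was vacated and the square that became occupied identify the move. A simplified sketch of that comparison (the real module worked from RGB-D images and handles captures, castling and so on, which this toy version omits):

```python
def detect_move(before, after):
    """Compare two 8x8 occupancy grids (True = square occupied)
    and return the (from, to) squares of a simple non-capture move.
    Capture, castling and en-passant handling are omitted."""
    src = dst = None
    for r in range(8):
        for c in range(8):
            if before[r][c] and not after[r][c]:
                src = (r, c)            # piece left this square
            elif after[r][c] and not before[r][c]:
                dst = (r, c)            # piece arrived on this square
    return src, dst
```

Once the move is recovered as a pair of squares, it can be validated against a chess engine and passed on to the planning module that drives the arm.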
This project is very well documented on a 'Read the Docs' page, detailing all the different modules, with videos and of course the source code. We stuck to current Python standards such as PEP 257 and would very much like to see someone else pick up on this. If you are interested, I highly recommend you have a look through it here.