Light and billowy, coarse and heavy, clingy or stiff as a board – the choice of textile has implications for the fit and drape of a garment, and the way it moves with its wearer. The EU-funded project FABRICMETRICS has paved the way for the commercialisation of innovative technology simulating the appearance and motion of clothes.
‘We have achieved a level of realism in simulating the behaviour of these fabrics that no other model has attained so far,’ says Miguel Otaduy of Universidad Rey Juan Carlos, Spain. ‘Models that were able to create small patches of a fabric did already exist, but they were not really able to simulate a whole garment at the resolution that we have reached.’
The researchers’ approach involves modelling the behaviour of the yarn in the fabrics thread by thread, rather than that of the textile as a whole, using an approach developed in predecessor project ANIMETRICS.
As of April 2019, Seddi Labs, a spin-off company set up to take this innovation to the market, is putting the final touches to a product intended for the fashion industry, says Otaduy, who led both projects.
Generally speaking, the aim is to create the first true engineering solution for the fashion industry, Otaduy explains. The product about to be launched could reduce or potentially even obviate the need to run up physical prototypes of proposed new garments. ‘What we are trying to do is create true virtual replicas of items of clothing,’ says Otaduy. ‘We call them “digital twins”.’
Rendering of a virtual dress, produced using technology from Seddi Labs, the spin-off of the FABRICMETRICS project.
© DESILICO SL (Seddi Labs)
One main thread
Textiles were just one area of interest explored in ANIMETRICS, Otaduy notes. Other items studied and simulated in this five-year endeavour included liquids and skin.
‘It was a very broad project in the type of phenomena that we wanted to tackle, and for which we tried to see whether some generic theories about how to create the models would apply to all these cases,’ he explains.
Every one of them involves particles that are connected or interact, but that move independently to some extent. ‘When they are in motion, the deformation that results involves a multitude of particles that are moving at the same time,’ Otaduy says.
ANIMETRICS approached the combined effect by focusing on these components rather than directly on the material as a whole, to work out how individual parts – patches of cloth, for instance – move and deform. These observations are then used to predict the behaviour of the full object or phenomenon by merging the various small-scale changes.
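The particle-based idea described above can be illustrated with a toy example. The sketch below — a hypothetical illustration assuming nothing about the project's actual code, with all names and constants invented for the example — models a small patch of fabric as a grid of particles, one per yarn crossing, joined to their neighbours by springs and advanced with semi-implicit Euler integration. The real yarn-level model is, of course, far more detailed.

```python
import math

# Hypothetical sketch: a square patch of fabric as a grid of particles
# (one per yarn crossing) joined to neighbours by springs.  All
# constants here are illustrative assumptions, not the project's model.

N = 5            # particles per side of the patch
REST = 0.1       # rest length of each yarn segment (m)
K = 500.0        # spring stiffness (N/m)
MASS = 0.01      # particle mass (kg)
DT = 0.001       # time step (s)
DAMP = 0.99      # per-step velocity damping factor
GRAVITY = -9.81  # gravitational acceleration on the y axis (m/s^2)

# positions[i][j] = (x, y); the top row (j == N - 1) is pinned in place.
positions = [[(i * REST, j * REST) for j in range(N)] for i in range(N)]
velocities = [[(0.0, 0.0) for _ in range(N)] for _ in range(N)]

def step():
    """Advance the patch by one semi-implicit (symplectic) Euler step."""
    forces = [[[0.0, GRAVITY * MASS] for _ in range(N)] for _ in range(N)]
    # Accumulate Hooke's-law spring forces between neighbouring crossings.
    for i in range(N):
        for j in range(N):
            for di, dj in ((1, 0), (0, 1)):
                ni, nj = i + di, j + dj
                if ni >= N or nj >= N:
                    continue
                ax, ay = positions[i][j]
                bx, by = positions[ni][nj]
                dx, dy = bx - ax, by - ay
                length = math.hypot(dx, dy) or 1e-9
                f = K * (length - REST)          # tension if stretched
                fx, fy = f * dx / length, f * dy / length
                forces[i][j][0] += fx
                forces[i][j][1] += fy
                forces[ni][nj][0] -= fx
                forces[ni][nj][1] -= fy
    # Integrate every particle except the pinned top row.
    for i in range(N):
        for j in range(N - 1):
            vx, vy = velocities[i][j]
            vx = (vx + DT * forces[i][j][0] / MASS) * DAMP
            vy = (vy + DT * forces[i][j][1] / MASS) * DAMP
            velocities[i][j] = (vx, vy)
            x, y = positions[i][j]
            positions[i][j] = (x + DT * vx, y + DT * vy)

# Let the patch sag under gravity for 0.2 simulated seconds.
for _ in range(200):
    step()
```

Even in this toy form, the key feature of the approach is visible: no equation describes the patch as a whole — the behaviour of the material emerges from the many small, simultaneous interactions between neighbouring particles.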
In terms of the animation of fabrics, this approach enabled the team to model the millions of points where lengths or loops of yarn connect in a woven or knitted garment, says Otaduy. ANIMETRICS, which ended in December 2016, was backed by a grant from the European Research Council (ERC), which enabled him to consolidate and grow his budding research team.
The animation of fabrics provided some of the most promising leads from this initial project, with two patents filed. These advances were taken forward in FABRICMETRICS, which notably developed the modelling program and the business plan for the future spin-off. This project, which ended in June 2018, also benefited from ERC support.
With this technology almost ready to make its mark on the world’s catwalks, Otaduy and his team are already considering improvements and other ways to help drive new trends in animation. And the momentum created with ANIMETRICS and FABRICMETRICS continues to build, he reports.
Among other activities, his group is now involved in two new EU-funded projects. One of these is the RAINBOW training network, which focuses on applications in medicine. The other, named TOUCHDESIGN, is dedicated to the simulation of skin and touch, for example to optimise the design of physical objects as varied as clothes and furniture, or to improve our interaction with virtual reality.
Miguel Otaduy is an associate professor of computer science at Universidad Rey Juan Carlos, Madrid, where he leads the Multimodal Simulation Lab (http://mslab.es). He works on physics-based simulation, with applications in computer animation, surgical training and planning, virtual touch, and digital fabrication. He has received an ERC Starting grant (Animetrics) and an ERC Consolidator grant (TouchDesign).