Zendaya’s dress at the 2019 Met Gala was a recreation of the famous Cinderella transformation moment, pulled straight from the 1950 animated film. I was brought onto a team of stylists and animatronics designers with just six weeks to prototype and execute an LED solution for colour-changing fabric in a daylight red carpet setting, after other ideas such as thermochromic dye had proven ineffective. A significant part of the transformation effect was a servo-controlled dynamic crinoline, for which Hussein Chalayan’s 2007 kinetic catwalk dresses were the main precedent — with members of that original team also working on this project.
The narrative reasoning behind the Cinderella dress was specific to Zendaya’s career trajectory, but for more on the general theme of that year’s event, read Susan Sontag’s excellent short essay ‘Notes on “Camp”’.
The key challenge in creating the ‘colour painted’ effect is making the light look saturated on the viewed surface. Everyone’s first thought — using the primary fabric as a diffuser for direct-view LEDs — is ruled out for several reasons: any even glow or homogeneous quality requires a void depth that can’t be engineered into this kind of structure, and effective diffusion by its nature knocks back the majority of the intensity. Diffusion also restricts your surface material options to lighter colours, reducing the ability to work with contrast to enhance any transition effects. You might likewise rule out other nice materials like asymmetric linear spreader optical film — which can give you an uninterrupted direct-view line of light with minimal depth from the source — as light intensity is again unavoidably cut significantly.
An obvious precedent for illuminated dresses was Claire Danes’s 2016 gown by Zac Posen, which used fibre optics — a beautiful dress that only works in the dark! As anyone who has worked with side-emitting fibre will tell you, the output is always disappointing. Please direct all the Lindsay Lohan supporters over to me!
The solution was found in front-illuminating a highly reflective fabric with a mix of specular and diffuse qualities — something I would describe as satin. In effect we create a sort of direct cove detail in the fabric seam. This of course has to maintain flexibility to perform as a garment while being held open enough for the light to escape, and keeping a distance between surface and source in order to increase travel. Two ‘cove’ types were used on the dress, both making use of side-emitting LED tape — for the bodice, a more rigid, prescriptive approach with a 3D-printed profile element that integrated onto leather panelling. The form itself was designed in Rhino with the aid of a 3D scanner, allowing me to track the spline to the real fabric panelling elements that had been sewn so far, and to vary the angle of attack of the LED chips along that spline.
The main skirt’s lighting profile shape was achieved with sticky-backed foam tape spacing at the peak of each pleat, with a fabric strip then sewn over loosely enough not to close the opening. This covers the ‘flare’ of seeing the chip directly. When constructing this approach, the fabric’s bias was an important consideration, as the non-stretchable (yet flexible in one direction) LED tape can fight the fabric’s nature. Multiple surface layers of organza and tulle were later added to more closely match the narrative source.
Side-emitting addressable form factors are very limited; the SK6812-SIDE is the one to go for, though it is still a little hard to source — there are no standard next-day distributors, so you have to order from China for any significant quantity. The offset 4020 package allows a lot more options than the standard 5050: for example, you can double up the 90 LED/m tape face-to-face so the chips ‘zip’ together into an almost seamless 180 LED/m. I abandoned this because it required a more complex data distribution pattern and assembly process, going instead with the still impressive 144 LED/m. Another element dropped during the design process was sticky-backed aluminium tape for wicking heat, due to stiffness and weight concerns — and the lack of airflow would have made any heatsinking attempts ineffective anyway.
I chose to over-supply the LEDs at 6v. This is within tolerance, allows thinner wiring gauges, reduces the voltage step required by the regulators (and thus the heat they produce), and removes the voltage-drop concerns that would otherwise require software colour correction. The negatives are lower LED lifespan and more heat dissipated by the LEDs themselves (don’t do this under normal conditions) — and pay attention to how far your control logic voltage strays from its rated range too.
The light effect was needed for an unknown amount of time, anywhere up to 15 minutes, with that key time-sensitive transition moment achieved by remote trigger. Existing RF products for remote triggering impressed me — mainly distributed for the hobbyist drone market — you can find relatively cheap modules to convert 2.4GHz to formats like PWM, but in this case I used SBUS contact closure relays to enable ‘scenes’ in a local controller.
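As a minimal sketch of how a contact-closure relay output can drive ‘scenes’ — the scene names and the advance-on-rising-edge behaviour are my assumptions, not the production code. The logic is written as plain C++ so it can be exercised off-device; on the controller, the `closed` flag would come from `digitalRead()` on the pin the SBUS relay switches.

```cpp
#include <cstdint>

// Hypothetical scene list for the transformation sequence.
enum Scene { IDLE, SHIMMER, TRANSFORM, FULL_GLOW, SCENE_COUNT };

struct SceneTrigger {
  Scene scene = IDLE;
  bool lastClosed = false;

  // Advance one scene on each rising edge of the relay contact
  // (closed now, open on the previous sample). Holding the contact
  // closed does not keep advancing. Returns the active scene.
  Scene update(bool closed) {
    if (closed && !lastClosed && scene + 1 < SCENE_COUNT) {
      scene = static_cast<Scene>(scene + 1);
    }
    lastClosed = closed;
    return scene;
  }
};
```

Sampling the contact every loop pass and only reacting to edges means the operator can hold the transmitter switch without skipping straight to the final scene.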
The initial thought was an ArtNet-to-SPI decoder that could store scenes locally, such as the LeDMX4 Pro with a microSD card, but this ended up being overkill for the intended content — the mapping only needed to be linear rather than radially distinct, allowing the data line to be shared across multiple runs and significantly reducing complexity.
The Arduino Pro Micro was chosen in the end as it has 5v logic output, good supply voltage tolerance, and an on-board bootloader in a small form factor. The only circuitry additions were pull-down resistors to avoid floating inputs — old-school through-hole parts are perfect for this. Very wide heatshrink was the only casing used for the board, keeping thickness to a minimum and opening up mounting options.
A portable power solution was key to get right. This number of LEDs fighting to be seen indirectly in daylight, despite their relative efficiency, required 450W (even after adjusting for colour and content). The drone lithium polymer batteries used, totalling 29600mAh, were capable of 370A constant discharge — meaning any unfused short would ignite the fabric, and even the silicone wiring, within a short amount of time.
You have to buy LiPo locally, as international shipment is restricted due to this fire hazard. 60A fuses in series with the power switches were an essential safety feature, though they were not designed to be changed quickly if blown; instead I soldered them in line with the power harness to minimise space as much as possible. The batteries’ weight and mounting were dealt with by suspending them from the substructure panel that the actuators were attached to, allowing us to retain the tight waistline silhouette of the dress.
Other recommended alternatives included lithium iron phosphate — Facebook groups with practicing engineers and designers continue to be the best source of this kind of information — but this amount of current on a wearable is still relatively uncharted territory and not recommended for persistent showtimes.
The hottest elements in the end were the voltage regulators stepping down from 7.4v (lower depending on remaining charge) to 6v. A silicone heat-resistant mat, normally used to protect surfaces while soldering, provided the contact fireproofing against the fabric. This solution only works for the relatively short period we needed: the lack of airflow meant the components still became incredibly hot under the main dress, albeit not directly touching skin or fabric.
The primary ‘distribution harness’ snaking around the waist — and allowing the dress material with attached LED to be connected as a separate element — uses shielded microphone cable, with ground maintained across each data junction, plus re-injected at multiple points to preserve a low resistance reference. Surprisingly no other standard anti-noise measures like capacitors were needed at this scale.
It wasn’t practical to test animations using the actual dress LEDs and power distribution for much of the build, as it was being developed concurrently with the animatronics and stitching, so I used a demo tool consisting of a clone controller with output connectors as a test rig. This was a modified version of a PCB I’ve developed for my own use, giving a trigger input and a rotary encoder input whose value is readable live in the serial monitor — allowing things like fine colour adjustment that the client themselves could play with, and letting me iterate quickly. IF statements can be used in the sketch to isolate each logic block, so test patterns can be kept alongside the concurrent live program without interference.
I used the FastLED library as it’s well supported and has low dynamic memory usage. You can’t blend multiple complex effects at the same time, as is common in lighting/VJ software, because the library doesn’t have a smart way to combine simultaneous frame effects targeting the same address outputs, and updating nested functions in the code tree causes a lot of issues. The way a program like ELM tackles this is Photoshop-style effects blending, calculating your mapped outputs after the fact. Ways around this in the Arduino IDE include writing changes to occur on an isolated dynamic range of addresses, then changing the range boundary values for the effects inside your main loop.
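A sketch of that address-range workaround — the effect functions are my own placeholders, and the buffer is a plain byte array standing in for FastLED’s CRGB array. Each effect only writes within its own half-open [start, end) slice, and the main loop animates the boundary so one effect sweeps over the other by handover rather than by compositing:

```cpp
#include <cstdint>

const int NUM_LEDS = 10;
uint8_t leds[NUM_LEDS]; // stand-in for FastLED's CRGB array

// Each effect touches only its own half-open range [start, end).
void effectBlue(int start, int end) {
  for (int i = start; i < end; i++) leds[i] = 1; // placeholder "blue"
}
void effectGold(int start, int end) {
  for (int i = start; i < end; i++) leds[i] = 2; // placeholder "gold"
}

// Moving the boundary each frame inside the main loop sweeps one
// effect across the strip without the two ever writing the same
// address in the same frame.
void renderFrame(int boundary) {
  effectBlue(0, boundary);
  effectGold(boundary, NUM_LEDS);
}
```

Sliding `boundary` from 0 to NUM_LEDS over successive loop passes gives a wipe transition between the two effects with no shared-address conflicts.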
FastLED also has some interesting functions for working with colour, the use of a modified-range HSV colour map over RGB offers some more balance, and can automatically invoke ‘temporal dithering’ to do a kind of gamma-correction on low ranges of the PWM. Commands like fadeToBlackBy() and random() became hugely useful in achieving dynamic effects that appear organic. Other fundamentals that had to be used were non-blocking delays, sometimes referred to as state machine logic, and time-debounced inputs to increase response stability.
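Those last two fundamentals can be sketched compactly — the interval and settle times are illustrative, and the clock is passed in as a parameter (it would be `millis()` on the hardware) so the logic is testable off-device:

```cpp
#include <cstdint>

// Non-blocking interval: fires when `interval` ms have elapsed since
// the last fire, without ever calling delay() and stalling the loop.
struct IntervalTimer {
  uint32_t last = 0;
  uint32_t interval;
  explicit IntervalTimer(uint32_t ms) : interval(ms) {}
  bool ready(uint32_t now) { // `now` would be millis() on hardware
    if (now - last >= interval) { last = now; return true; }
    return false;
  }
};

// Time-debounced input: a change in the raw reading only counts once
// it has held steady for `settle` ms, rejecting contact chatter.
struct Debouncer {
  bool stable = false;
  bool lastRaw = false;
  uint32_t changedAt = 0;
  uint32_t settle;
  explicit Debouncer(uint32_t ms) : settle(ms) {}
  bool read(bool raw, uint32_t now) {
    if (raw != lastRaw) { changedAt = now; lastRaw = raw; }
    if (now - changedAt >= settle) stable = raw;
    return stable;
  }
};
```

Both are polled every pass of the main loop, which is what keeps animation frames, trigger inputs, and serial echo all responsive at once.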