Imagine the scene: white sandy beach, turquoise waters, the sound of waves gently rolling onto the shoreline. You think to yourself, “This is where I want to spend my day, or maybe even the rest of my days.” As you begin to drift away mentally, reality sets in for a moment and you realize that the fireball in the sky is going to cook your leathery exterior in about 10 minutes if you don’t put some sunblock on. You reach over, grab your bottle of scientifically formulated SPF sunscreen, and lather on a big handful of liquid…DNA?
The vacation scenario comes to a screeching halt when you hit that last sentence. Yet the question you should be asking yourself is whether the tale is really all that far-fetched. Investigators at Binghamton University, State University of New York (SUNY) didn’t think so, as they just released data on their development of a coating made of DNA that gets better at protecting skin from ultraviolet (UV) light the more it is exposed to the sun, in addition to keeping the skin hydrated. Findings from the new study were published today in Scientific Reports in an article entitled “Non-Ionizing UV Light Increases the Optical Density of Hygroscopic Self Assembled DNA Crystal Films.”
The research team developed thin, optically transparent crystalline DNA films. When they irradiated these films with UV light, the scientists found that the more exposure the films received, the better they became at absorbing it.
“UV light can actually damage DNA, and that’s not good for the skin,” explained senior study investigator Guy German, Ph.D., assistant professor of biomedical engineering at Binghamton University. “We thought, let’s flip it. What happens instead if we actually used DNA as a sacrificial layer? So instead of damaging DNA within the skin, we damage a layer on top of the skin.”
Beyond the acute effects of UV exposure, such as sunburn, previous studies have shown that UV irradiation leads to premature aging of the skin and to leading forms of skin cancer, such as basal cell carcinoma and melanoma. The UV light spectrum is divided into four segments, of which only two, UVA and UVB, have been associated with skin damage and cancer. These UVA and UVB wavelengths are where the Binghamton University researchers focused their attention.
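As a quick reference, the snippet below lists the conventional boundaries of the UVA, UVB, and UVC bands. These cutoffs follow the commonly used ISO 21348 convention and are general reference values, not figures taken from the study itself.

```python
# Commonly cited UV band boundaries in nanometers (ISO 21348 convention);
# reference values only, not numbers reported in the Scientific Reports paper.
UV_BANDS_NM = {
    "UVC": (100, 280),  # largely absorbed by the ozone layer
    "UVB": (280, 315),  # reaches the surface; linked to sunburn and skin cancer
    "UVA": (315, 400),  # reaches the surface; linked to photoaging and cancer
}

def uv_band(wavelength_nm: float) -> str:
    """Return the UV band a given wavelength (in nm) falls into, if any."""
    for band, (low, high) in UV_BANDS_NM.items():
        if low <= wavelength_nm < high:
            return band
    return "outside the UV range"

print(uv_band(300))  # -> UVB
print(uv_band(365))  # -> UVA
```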
“We report on ultraviolet (UV) light induced increases in the UV optical density of thin and optically transparent crystalline DNA films formed through self-assembly,” the authors wrote. “The films are comprised of closely packed, multifaceted and sub micron sized crystals. UV-Vis spectrophotometry reveals that DNA films with surface densities up to 0.031 mg/mm² can reduce the transmittance of incident UVC and UVB light by up to 90%, and UVA transmittance by up to 20%. Subsequent and independent film irradiation with either UVA or UVB dosages upwards of 80 J/cm² both reduce UV transmittance, with reductions scaling monotonically with UV dosage.”
Dr. German added that “if you translate that, it means to me that if you use this as a topical cream or sunscreen, the longer that you stay out on the beach, the better it gets at being a sunscreen.”
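To connect the transmittance figures quoted from the abstract with the “optical density” in the paper’s title, the standard relation OD = −log₁₀(T) can be used. The short sketch below applies that relation to the headline numbers above; it is purely an illustration of the conversion, not a reconstruction of the authors’ analysis.

```python
import math

def optical_density(transmittance: float) -> float:
    """Optical density for a given fractional transmittance (0 < T <= 1)."""
    return -math.log10(transmittance)

# A 90% reduction in UVB/UVC transmittance leaves T = 0.10, i.e. OD = 1.0
print(f"UVB/UVC: T = 0.10 -> OD = {optical_density(0.10):.2f}")

# A 20% reduction in UVA transmittance leaves T = 0.80, i.e. OD ~ 0.10
print(f"UVA:     T = 0.80 -> OD = {optical_density(0.80):.2f}")
```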
The research team also found that the DNA coatings are hygroscopic, meaning that skin coated with the DNA films can store and hold water. When applied to human skin, the films slow water evaporation and keep the tissue hydrated for extended periods of time. This was an exciting discovery, as the DNA films could potentially be used as a wound covering in extreme environments.
“Not only do we think this might have applications for sunscreen and moisturizers directly, but if it’s optically transparent and prevents tissue damage from the sun and it’s good at keeping the skin hydrated, we think this might be potentially exploitable as a wound covering for extreme environments,” Dr. German concluded.