Physicists sometimes get a bad rap. Theoretical physicists even more so. Consider Sheldon Cooper in the TV sit-com The Big Bang Theory:
Sheldon: I’m a physicist. I have a working knowledge of the entire universe and everything it contains.
Penny: Who’s Radiohead?
Sheldon: (after several seconds of twitching) I have a working knowledge of the important things in the universe.
But a working knowledge of anything is always informed and arguably improved, even transformed, by robust and analytical “thought experiments.” In fact, theoretical physics is key to advancing our understanding of the universe, from the cosmological to the particle scale, through mathematical models.
That is why Mikhael Semaan and others like him spend their time in the abstract, standing on the figurative shoulders of past giants and figuring out what could happen . . . theoretically. That Semaan is also one of the celebrated postdoctoral researchers/mentors in the Science Research Initiative (SRI) is a coup for undergraduates at the University of Utah who learn by doing in a variety of labs and field sites.
“The SRI is awesome,” Semaan said. It’s “a dream job where I can continue advancing my own research while bridging the gap in early undergraduate research experiences, giving them access to participation in the cutting edge alongside personalized mentoring.”
Want to learn how to bake something? Hire a baker. Better still, watch the baker bake (and maybe even lick the bowl when allowed). And now that Semaan’s second first-author paper has just “dropped,” students get to witness in real time how things get done, incrementally adding to the trove of scientific knowledge that, as past experience shows, can change the world.
The physicist and writer C.P. Snow said that the first three laws of thermodynamics can be pithily summarized with, “You can’t win. You can’t even break even. You can’t stay out of the game.” Semaan elaborated on the second law, “The universe must increase its entropy—its degree of disorder—on average…[b]esides offering an excuse for a messy room, this statement has far-reaching implications and places strict limits on the efficiency of converting one form of energy to another … .”
These limits are obeyed by everything from the molecular motors in our bodies to the increasingly sophisticated computers in our pockets to the impacts of global industry on the Earth’s climate and beyond. Yet in the second law’s case, there’s a catch: it turns out that information in the abstract is itself a form of entropy. This insight is key to the much-celebrated “Landauer bound” that states that learning about a system — going from uncertainty to certainty — fundamentally costs energy.
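The Landauer bound has a concrete number attached to it. As a back-of-envelope sketch (not from Semaan’s paper; the temperature of 300 K is an illustrative assumption), the minimum energy cost of erasing a single bit is the Boltzmann constant times temperature times the natural log of 2:

```python
import math

# Landauer bound: erasing one bit at temperature T dissipates at least
# k_B * T * ln(2) joules of energy. Room temperature here is an
# illustrative assumption.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # roughly room temperature, K

E_min = k_B * T * math.log(2)  # minimum energy to erase one bit, J
print(f"Landauer bound at {T:.0f} K: {E_min:.3e} J per bit")
```

At room temperature this works out to roughly 3 zeptojoules per bit, which is why the bound matters in principle long before it matters to any real hard drive.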
But what about the converse situation? If it costs energy to reduce uncertainty, can we extract energy by gaining it—for example, by scrambling a hard drive? If so, how much?
Ratchet information
To answer this question, previous researchers imagined a ratchet that moves in one direction along an “information tape,” interacting with one bit at a time. As it does so, the ratchet modifies the tape’s statistical properties. That “tape” could be the hard drive in your computer or could be a sequence of base pairs in a strand of DNA.
“In this situation, by scrambling an initially ordered tape, we can actually extract heat from the environment, but only by increasing randomness on the tape,” Semaan explained. While the second law still holds, it is modified. “The randomness of the information in the tape is itself a form of entropy and we can reduce the entropy in our thermal environment as long as we sufficiently increase it in the tape.”
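The trade-off can be made quantitative. In the standard information-ratchet picture, the heat extractable per bit is bounded by k_B T ln(2) times the Shannon entropy gained per tape symbol. The sketch below is a hypothetical illustration under that assumption (the bit biases are made up, not taken from the paper):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # illustrative temperature, K

def shannon_entropy_bits(p_one: float) -> float:
    """Shannon entropy, in bits, of a biased bit with P(1) = p_one."""
    if p_one in (0.0, 1.0):
        return 0.0
    p_zero = 1.0 - p_one
    return -(p_one * math.log2(p_one) + p_zero * math.log2(p_zero))

# Hypothetical tape: starts highly ordered (5% ones), and the ratchet
# scrambles it to a fair coin (50% ones). The entropy gained per bit
# bounds the heat the ratchet may draw from its thermal environment.
h_in = shannon_entropy_bits(0.05)   # entropy of the ordered input tape
h_out = shannon_entropy_bits(0.5)   # entropy of the scrambled output

delta_h = h_out - h_in                    # entropy gain, bits per symbol
q_max = k_B * T * math.log(2) * delta_h   # max extractable heat per bit, J

print(f"entropy gain: {delta_h:.3f} bits/symbol")
print(f"max extractable heat at {T:.0f} K: {q_max:.3e} J per bit")
```

Once the tape is fully random, delta_h drops to zero and no further heat can be extracted, which is the sense in which scrambling is a finite resource.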
In the literature, the laws bounding this behavior are termed information processing second laws, in reference to their explicit accounting for information processing (via modifying the tape) in the second law of thermodynamics. In this new paper, Semaan and co-authors uncover an “information processing first law,” a similar modification to the first law of thermodynamics, which unifies and strengthens various second laws in the literature. It appears to do more, too: it offers a way to tighten those second laws to place stricter limits on the allowed behavior for systems which have “nonequilibrium steady states.”
Non-equilibrium steady state systems—our bodies, the global climate, and our computers are all examples—need to constantly absorb and dissipate energy, and so stay out of equilibrium, even in “steady” conditions (contrast a cup of coffee left out: its “steady” state is complete equilibrium with the room).
“It turns out,” said Semaan, “that in this case we must ‘pay the piper’: we can still scramble the tape to extract heat, but only if we do so fast enough to keep up with the non-equilibrium steady states.”
This uni-directional ratcheting mechanism may someday lead to engineering a device that harnesses energy from scrambling a hard drive. But first, beyond engineering difficulties, there is much left to understand about the mathematical, idealized limits of this behavior. In other words, we still have a ways to go, even “in theory.” There are plenty of remaining questions to address, the fodder for any theoretical physicist worth their salt.
Find the full story at the College of Science.