It starts with this post about temporal cloaking, which I seem to have encountered about two months too late. It describes how a team at Cornell has developed a mechanism to slow time around an event, creating a bubble of spacetime around it and thus making it seem as though the event never happened. Another post which I just discovered explains a few more theoretical details about the phenomenon, this time from Imperial College London. You can find the original paper here.
Reading through the paper, it doesn’t sound very impressive (but then again which dry scientific paper ever does?) especially because the maximum time they could temporally cloak anything turned out to be 110 ns; in theory, that could go up to 120 microseconds.
The number of sci-fi scenarios that could arise from this literally blows my mind. I was thinking about some of this stuff when I encountered a commenter’s thought that made my ideas far more concrete: this might be new technology, but imagine applying it to the screening of signals. Data is encoded in signals, and you don’t have to make all data vanish for it to have an impact on the veracity or accuracy of your information.
Just think about that for a second.
And after that V asked me what the hell the point of 110 ns as any kind of measure was, because it’s not immediately obvious what that does. So I tried to create a scenario for her.
- The average PC processor these days operates at about 4 GHz (actually closer to 3.5, but whatever).
- That means that the period of a single clock cycle is about 0.25 nanoseconds.
- 110 ns translates into 440 cycles of 0.25 nanoseconds each.
- Now, consider the number of cycles it would take for a single instruction to execute.
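The arithmetic in the steps above can be sketched in a few lines (the 4 GHz figure is the rough estimate from the list, not a measurement):

```python
# Back-of-the-envelope: how many CPU clock cycles fit inside a 110 ns cloak?

clock_hz = 4e9               # rough desktop CPU clock rate (~4 GHz)
period_ns = 1e9 / clock_hz   # duration of one clock cycle, in nanoseconds
cloak_ns = 110               # duration of the temporal cloak

cycles = cloak_ns / period_ns
print(period_ns)  # 0.25
print(cycles)     # 440.0
```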
Here’s where I hit a stumbling block. I’m supposed to know this sort of thing because I’m a computer engineering student, apparently, but a) I wasn’t particularly stellar, b) I have the worst memory and c) I’m very software right now. I crowdsourced the question on Facebook and was firmly put in my place for asking such a ridiculously wishy-washy thing (my commenter was right, anyway — there are too many things involved in comp arch to ever answer a question like that without knowing the entire, exact background).
Incidentally, I believe the DP512 we used in microcontroller classes operates at 16 MHz, which works out to a far less impressive clock period of about 62.5 ns. Not much help there.
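For comparison, here’s the same arithmetic for a 16 MHz part like the DP512 (again, just my rough numbers, not a datasheet figure):

```python
# Same back-of-the-envelope math for a 16 MHz microcontroller clock.

clock_hz = 16e6              # 16 MHz
period_ns = 1e9 / clock_hz   # one clock cycle, in nanoseconds
cycles = 110 / period_ns     # cycles that fit inside the 110 ns cloak

print(period_ns)  # 62.5
print(cycles)     # 1.76 -- barely a single cycle fits in the cloak
```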
But if you want to consider something as basic as a fetch-decode-read-execute sequence, you’re thinking of at least four cycles, maybe ten if you factor in subroutine jumps and the different kinds of memory access… I think. I’m probably still simplifying this far too much into uC territory.
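Taking those hand-wavy figures at face value (four to ten cycles per instruction, 440 cycles in the window — estimates, not measurements), the cloak could hide on the order of tens to about a hundred whole instructions:

```python
# Rough guess at how many whole instructions a 110 ns window could hide
# on a 4 GHz machine, assuming 4 to 10 cycles per instruction.
# These cycle counts are guesses, not measured figures.

cycles_available = 110 / 0.25  # 440 cycles fit in the cloaked window

for cycles_per_instr in (4, 10):
    instrs = int(cycles_available // cycles_per_instr)
    print(cycles_per_instr, instrs)  # 4 -> 110 instructions, 10 -> 44
```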
Of course now the question is why the hell I’m torturing both myself and others about this piece of technology. And the answer is: possibilities! Sci-fi! I’m not talking about cloaking entire people or underground cities cloaked in time or anything (although that would be fabulous). I’m talking about bits of data just carefully concealed. I swear to God the DoD probably poured massive funding into this project.
Here’s what I typed to T, while we were discussing the possibilities:
And what I typed to V:
Of course this is stupidity, in some ways, because it’s not likely they have a portable micro version of these lenses that they can simply insert where they want to. But here’s the thing about sci-fi — you have license to imagine and hope for the future.
Or possibly dread, if we’re talking about a totalitarian system that jacks into records to rewrite history and itself. 1984, guys — doubleplusungood.
Science: it’s like magic, except it works!
In other, almost just-as-startling news, have a look at the Crazy Camouflage Cephalopod: