Tuesday, October 31, 2017

Links + coming soon

Real life is a bit busy right now, but I wanted to point out a couple of links and talk about what's coming up.
  • I've been looking for ways to think about and discuss topological materials that might be more broadly accessible to non-experts, and I found this paper and videos like this one and this one.  Very cool, and I'm sorry I'd missed it back in '15 when it came out.
  • In the experimental literature on realizations of Majorana fermions in the solid state, a key signature is a peak in the conductance at zero voltage - an indicator that there is a "zero-energy mode" in the system.  There are other ways to get zero-bias peaks, though, and establishing whether an observed peak has the expected properties (magnitude, response to magnetic fields) has been a lingering issue.  This seems to nail down the situation more firmly.
  • Discussions about "quantum supremacy" strictly in terms of how many qubits can be simulated on a classical computer right now seem a bit silly to me.  OK, so IBM managed to simulate a handful of additional qubits (56 rather than 49).  It wouldn't shock me if they could get up to 58 - supercomputers are powerful and programmers can be very clever.  Are we going to get a flurry of news stories every time this happens about how it somehow moves the goalposts for quantum computers?  (See the quick back-of-the-envelope sketch after this list for why brute-force simulation gets hard so fast.)
  • I'm hoping to put out a review of Max the Demon and the Entropy of Doom, since I received my beautifully printed copies this past weekend.
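For a sense of scale on the classical simulation question, here is a quick back-of-the-envelope sketch (my own illustration, not a description of IBM's actual approach, which avoids storing the full state vector) of the memory a brute-force state-vector simulation would need:

```python
# Naive memory cost of brute-force state-vector simulation of n qubits:
# 2**n complex amplitudes at 16 bytes each, so the requirement doubles
# with every added qubit.
for n in (49, 56, 58):
    bytes_needed = (2 ** n) * 16
    print(f"{n} qubits: ~{bytes_needed / 1e15:.0f} petabytes")
# 49 qubits: ~9 petabytes
# 56 qubits: ~1153 petabytes
# 58 qubits: ~4612 petabytes
```

The point is just that each added qubit doubles the naive requirement, which is why these records move through algorithmic cleverness rather than bigger machines.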

Wednesday, October 25, 2017

Thoughts after a NSF panel

I just returned from an NSF proposal review panel.  I wrote about NSF panels in the early days of this blog here, back when I may have been snarkier.

  • Some things have gotten better.  We can work from our own laptops, and I think we're finally to the point where everyone at these things is computer literate and can use the online review system.  The program officers do a good job making sure that the reviews get in on time (ahead of the meeting).
  • Some things remain the same.  I'm still mystified at how few people from top-ranked programs (e.g., Harvard, Stanford, MIT, Cornell, Caltech, Berkeley) I see at these.  Maybe I just don't move in the right circles.
  • Best quote of the panel:  "When a review of one of my papers or proposals starts with 'Author says' rather than 'The author says', I know that the referee is Russian and I'm in trouble."
  • Why does the new NSF headquarters have tighter security screening than Reagan National Airport?
  • The growth of funding costs and eight years of numerically flat budgets have made this process more painful.  Sure looks like morale is not great at the agency.  It's really not clear where this is all going to go over the next few years.  There was a lot of gallows humor about having "tax payer advocates" on panels.  (Everyone on the panel is a US taxpayer already, though apparently that doesn't count for anything because we are scientists.)
  • NSF is still the most community-driven of the research agencies. 
  • I cannot overstate the importance of younger scientists going to one of these panels and seeing how the system works, so that they learn how proposals are evaluated.




Monday, October 23, 2017

Whither science blogging?

I read yesterday of the impending demise of scienceblogs, a site that has been around since late 2005 in one form or other.  I guess I shouldn't be surprised, since some of its bloggers have shifted to other sites in recent years, such as Ethan Siegel and Chad Orzel, who largely migrated to Forbes, and Rhett Allain, who went to Wired.  Steinn Sigurðsson is going back to his own hosted blog in the wake of this.

I hope this is just indicative of a poor business model at Seed Media, and not a further overall decline in blogging by scientists.  It's wonderful that online magazines like Quanta and Aeon and Nautilus are providing high quality, long-form science writing.  Still, I think everyone benefits when scientists themselves (in addition to professional science journalists) carve out some time to write about their fields.



Friday, October 20, 2017

Neutron stars and condensed matter physics

In the wake of the remarkable results reported earlier this week regarding colliding neutron stars, I wanted to write just a little bit about how a condensed matter physics concept is relevant to these seemingly exotic systems.

When you learn high school chemistry, you learn about atomic orbitals, and you learn that electrons "fill up" those orbitals starting with the lowest energy (most deeply bound) states, two electrons of opposite spin per orbital.  (This is a shorthand way of talking about a more detailed picture, involving words like "linear combination of Slater determinants", but that's a detail in this discussion.)  This filling is a consequence of the Pauli principle:  because electrons are fermions, they can't all just fall down into the lowest energy level.  In solid state systems we can apply the same ideas.  In a metal like gold or copper, the density of electrons is high enough that the states are filled up to large kinetic energies, and the highest-energy electrons are moving around at ~0.5% of the speed of light (!).

If you heat up the electrons in a metal, they spread out in energy, with some electrons occupying higher energy levels and some lower levels left empty.   To decide whether the metal is really "hot" or "cold", you need a point of comparison, and the relevant energy scale gives you that.  If most of the low energy levels are still filled, the metal is cold.  If the ratio of the thermal energy scale, \(k_{\mathrm{B}}T\), to the depth of the lowest energy levels (essentially the Fermi energy, \(E_{\mathrm{F}}\)) is much less than one, then the electrons are said to be "degenerate".  In common metals, \(E_{\mathrm{F}}\) is several eV, corresponding to a temperature of tens of thousands of Kelvin.  That means that even near the melting point of copper, the electrons are effectively very cold.
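To put rough numbers on that, here is a quick free-electron sketch (a standard textbook estimate, using an assumed conduction-electron density for copper, not anything tied to a particular measurement):

```python
import math

# Free-electron estimate of the Fermi energy, Fermi temperature, and Fermi
# velocity of copper, from an assumed conduction electron density.
hbar = 1.055e-34   # J*s
m_e  = 9.11e-31    # kg
k_B  = 1.381e-23   # J/K
eV   = 1.602e-19   # J
c    = 3.0e8       # m/s

n   = 8.5e28                          # electrons per m^3, textbook value for Cu
k_F = (3 * math.pi**2 * n) ** (1/3)   # Fermi wavevector
E_F = hbar**2 * k_F**2 / (2 * m_e)    # Fermi energy
v_F = hbar * k_F / m_e                # Fermi velocity

print(f"E_F ~ {E_F / eV:.1f} eV")            # ~7 eV
print(f"T_F ~ {E_F / k_B:.1e} K")            # ~8e4 K
print(f"v_F ~ {100 * v_F / c:.1f}% of c")    # ~0.5% of c
```

Those numbers are the basis for the statements above:  a Fermi energy of several eV, a Fermi temperature of tens of thousands of Kelvin, and electrons moving at about half a percent of the speed of light.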

Believe it or not, a neutron star is a similar system.  If you squeeze a bit more than one solar mass into a sphere roughly 10 km in radius, the gravitational attraction is so strong that the electrons and protons in the matter are crushed together to form a degenerate ball of neutrons.  Amazingly, by our reasoning above, the neutrons are actually very, very cold.  The Fermi energy for those neutrons corresponds to a temperature of nearly \(10^{12}\) K.  So, right up until they smashed into each other, those two neutron stars spotted by the LIGO observations were actually incredibly cold, condensed objects.   It's also worth noting that the properties of neutron stars are likely affected by another condensed matter phenomenon, superfluidity.   Just as electrons can pair up and condense into a superconducting state under some circumstances, it is thought that cold, degenerate neutrons can do the same thing, even when "cold" here might mean \(5 \times 10^{8}\) K.
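Repeating the same kind of estimate for neutrons gives the temperature quoted above (this is deliberately crude:  non-relativistic, non-interacting, and using round numbers of 1.4 solar masses and a 10 km radius):

```python
import math

# Degenerate-Fermi-gas estimate for the neutrons in a neutron star.
# Assumed round numbers: 1.4 solar masses of neutrons in a 10 km radius sphere.
# Non-relativistic and non-interacting, so only good for an order of magnitude.
hbar  = 1.055e-34   # J*s
k_B   = 1.381e-23   # J/K
m_n   = 1.675e-27   # kg, neutron mass
M_sun = 1.989e30    # kg

M = 1.4 * M_sun
R = 1.0e4                                # m
n = (M / m_n) / (4/3 * math.pi * R**3)   # neutron number density
k_F = (3 * math.pi**2 * n) ** (1/3)
E_F = hbar**2 * k_F**2 / (2 * m_n)

print(f"T_F ~ {E_F / k_B:.1e} K")        # ~1e12 K
```

Compare that Fermi temperature to the \(5 \times 10^{8}\) K mentioned above, and "cold" really is the right word.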

Sunday, October 15, 2017

Gravitational waves again - should be exciting

There is going to be a big press conference tomorrow, apparently to announce that LIGO/VIRGO has seen an event (binary neutron star collision) directly associated with a gamma ray burst in NGC 4993.  Fun stuff, and apparently the worst-kept secret in science right now.  This may seem off-topic for a condensed matter blog, but there's physics in there which isn't broadly appreciated, and I'll write a bit about it after the announcement.

Tuesday, October 10, 2017

Piezo controller question - followup.

A couple of weeks ago I posted:

Anyone out there using a Newport NPC3SG controller to drive a piezo positioning stage, with computer communication successfully talking to the NPC3SG?  If so, please leave a comment so that we can get in touch, as I have questions.

No responses so far.  This is actually the same unit as this thing:
https://www.piezosystem.com/products/piezo_controller/piezo_controller_3_channel_version/nv_403_cle/

In our unit from Newport, computer communications simply don't work properly - we get timeout problems.  The LabVIEW code supplied by Newport (the same code paired with the link above) has these problems, as do many other ways of trying to talk with the instrument.  Has anyone out there had success in using a computer to control and read this thing?   At issue is whether this is a hardware problem with our unit, or whether there is a general problem with these.  The vendor has been verrrrrrrrry slow to figure this out.
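For anyone who wants to compare notes, here is the sort of bare-bones test we've been running, written as a minimal PyVISA sketch.  The resource name, termination characters, and identification query are placeholders (I'm not claiming the NPC3SG speaks standard SCPI), so treat this as an illustration of the test rather than vendor-supplied code:

```python
import pyvisa

# Minimal communication test for a serial/USB instrument via PyVISA.
# The resource name, terminations, and query string are placeholders -
# set them to whatever the NPC3SG manual and your machine actually use.
rm = pyvisa.ResourceManager()
print(rm.list_resources())                 # see how the controller enumerates

inst = rm.open_resource("ASRL3::INSTR")    # placeholder resource name
inst.timeout = 10000                       # ms; generous, to distinguish slow replies from true timeouts
inst.write_termination = "\r\n"            # placeholder - check the manual
inst.read_termination = "\r\n"

try:
    print(inst.query("*IDN?"))             # placeholder identification query
except pyvisa.errors.VisaIOError as err:
    print("Timed out or errored:", err)
```

If the unit can't answer something this simple reliably, that would point toward hardware or firmware rather than anything about our LabVIEW setup.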

Sunday, October 08, 2017

The Abnormal Force

How does the chair actually hold you up when you sit down?  What is keeping your car tires from sinking through the road surface?  What is keeping my coffee mug from falling through my desk?  In high school and first-year undergrad physics, we teach people about the normal force - that is a force that acts normal (perpendicular) to a surface, and it takes on whatever value is needed so that solid objects don't pass through each other.

The microscopic explanation of the normal force is that the electrons in the atoms of my coffee mug (etc.) interact with the electrons in the atoms of the desk surface, through a combination of electrostatics (electrons repel each other) and quantum statistics (the Pauli principle means that you can't just shuffle electrons around willy-nilly).  The normal force is "phenomenological" shorthand.  We take the observation that solid objects don't pass through each other, deduce that whatever is happening microscopically, the effect is that there is some force normal to surfaces that touch each other, and go from there, rather than trying to teach high school students how to calculate it from first principles.  The normal force is an emergent effect that makes sense on macroscopic scales without knowing the details.  This is just like how we teach high school students about pressure as a useful macroscopic concept, without actually doing a statistical calculation of the average perpendicular force per area on a surface due to collisions with molecules of a gas or a liquid.  

You can actually estimate the maximum reasonable normal force per unit area.  If you tried to squeeze the electrons of two adjacent atoms into the volume occupied by one atom, even without the repulsion of like charges adding to the cost, the Pauli principle means you'd have to kick some of those electrons into higher energy levels.  If a typical energy scale for doing that for each electron was something like 1 eV, and you had a few electrons per atom, and the areal density of atoms is around \(10^{14}\) per cm\(^2\), then we can find the average force \(F_{\mathrm{av}}\) required to make a 1 cm\(^2\) area of two surfaces overlap with each other.   We'd have \(F_{\mathrm{av}} d \sim 10^{15}\) eV, where \(d\) is the thickness of an atom, around 0.3 nm.   That's around 534,000 Newtons/cm\(^2\), or around 5.3 GPa.   That's above almost all of the yield stresses for materials (usually worrying about tension rather than compression) - that just means that the atoms themselves will move around before you really push electrons around.
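For those who like to see the arithmetic, here is the same order-of-magnitude estimate in a few lines (the \(10^{15}\) eV and 0.3 nm are the same rough inputs as in the text):

```python
# Order-of-magnitude ceiling on the normal force per unit area, as estimated in
# the text: promoting a few electrons per atom by ~1 eV each, over the ~1e14
# atoms in a square centimeter of surface, costs of order 1e15 eV, and that
# energy must be supplied over roughly one atomic thickness.
eV = 1.602e-19                  # J
work = 1e15 * eV                # J needed to overlap 1 cm^2 of two surfaces
d = 0.3e-9                      # m, roughly one atomic thickness

F_av = work / d                 # average force, per cm^2 of contact
print(f"F_av ~ {F_av:.2e} N/cm^2")              # ~5.3e5 N/cm^2
print(f"      ~ {F_av * 1e4 / 1e9:.1f} GPa")    # ~5.3 GPa
```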

Very occasionally, when two surfaces are brought together, there is a force that arises at the interface that is not along the normal direction.  A great example of that is in this video, which shows two graphite surfaces that spontaneously slide in the plane so that they are crystallographically aligned.  That work comes from this paper.

As far as I can tell, there is no official terminology for such a spontaneous in-plane force.  In the spirit of one of my professional heroes David Mermin, who coined the scientific term boojum, I would like to suggest that such a transverse force be known as the abnormal force.  (Since I don't actually work in this area and I'm not trying to name the effect after myself, hopefully the barrier to adoption will be lower than the one faced by Mermin, who actually worked on boojums :-)  ).

Tuesday, October 03, 2017

Gravitational radiation for the win + communicating science

As expected, LIGO was recognized by the Nobel Prize in physics this year.  The LIGO experiment is an enormous undertaking that combines elegant, simple theoretical ideas; incredible engineering and experimental capabilities; and technically virtuosic numerical theoretical calculations and data analysis techniques.  It's truly a triumph.

I did think it was interesting when Natalie Wolchover, one of the top science writers out there today, tweeted:  "Thrilled they won, thrilled not to spend this morning speed-reading about some bizarre condensed matter phenomenon."

This sentiment was seconded by Peter Woit, who said he thought she spoke for all science journalists.

Friendly kidding aside, I do want to help.  Somehow it's viewed as comparatively straightforward to write about this, or this, or this, but condensed matter is considered "bizarre".

Sunday, October 01, 2017

Gravitational radiation redux + Nobel speculation

This past week, there was exciting news that the two LIGO detectors and the VIRGO interferometer had simultaneously detected the same event, a merger of black holes estimated to have taken place 1.6 billion light-years away.  From modeling the data, the black hole masses are estimated at around 25 and 30 solar masses, and around 2.7 solar masses worth of energy (!) was converted in the merger into gravitational radiation.  The preprint of the paper is here.  Check out figure 1.  With just the VIRGO data, the event looks really marginal - by eye you would be hard pressed to pick it out of the fluctuating detector output.  However, when that data is combined with the data from the two LIGO detectors (which are completely independent of VIRGO), the case is quite strong.

This is noteworthy for (at least) two reasons.  First, there has been some discussion about the solidity of the previously reported LIGO results - this paper (see here for a critique of relevant science journalism) argues that there are some surprising correlations in the noise background of the two detectors that could make you wonder about the analysis.  After all, the whole point of having two detectors is that a real event should be seen by both, while one might reasonably expect background jitter to be independent since the detectors are thousands of miles apart.  Having a completely independent additional detector in the mix should be useful in quantifying any issues.  Second, having the additional detector helps nail down the spot in the sky where the gravitational waves appear to originate.  This image shows how previous detections could only be localized by two detectors to a band spanning lots of the sky, while this event can be localized down to a spot spanning a tenth as much solid angle.    This is key to turning gravitational wave detectors into serious astronomy tools, by trying to link gravitational event detection to observations across the electromagnetic spectrum.  There were rumors, for example, that LIGO had detected what was probably a neutron star collision (smaller masses, but far closer to earth), the kind of event thought to produce dramatic electromagnetic signatures like gamma ray bursts.

On that note, I realized Friday that this coming Tuesday is the announcement of the 2017 Nobel in physics.  Snuck up on me this time.  Speculate away in the comments.  Since topology in condensed matter was last year's award, it seems likely that this year will not be condensed matter-related (hurting the chances of people like Steglich and Hosono for heavy fermion and iron superconductors, respectively).  Negative index phenomena might be too condensed matter related.   The passing last year of Vera Rubin and Deborah Jin is keenly felt, and makes it seem less likely that galactic rotation curves (as evidence for dark matter) or ultracold fermions would get it this year.  Bell's inequality tests (Aspect, Zeilinger, Clauser) could be there.   The LIGO/VIRGO combined detection happened too late in the year to affect the chances of this being the year for gravitational radiation (which seems a shoo-in soon).