This was a pretty interesting book about the data revolution in baseball. One of the main people it follows is Trevor Bauer, whom, as an Astros fan (and non-crazy person), I guess I’m obliged to dislike? But the stuff he’s done has been pretty interesting.
Honestly I should have liked this book more than I did – it covers a wide range of topics and it’s pretty well written – but I struggled to get through it. Maybe I’m getting tired of baseball books? But if you’re at all interested in modern baseball I’d recommend it.
The main idea of this book is that when kids are having behavioral issues, adults tend to see them as “top-down” where the child is choosing to misbehave for some reason. But often, what’s happening is “bottom-up”: the child has something deeper going on and it comes out via misbehavior, and to really fix the bad behavior you need to address the underlying problem. Just trying to give stickers for good behavior and punishing bad behavior isn’t going to help anything! (this is also not a huge surprise given the rewards book)
He uses a color shorthand to categorize a child’s state of arousal: green means the child feels safe and connected and able to learn; red means the child is like the “fight” in “fight or flight”, often with a rapid heartbeat, sweating, etc.; blue means the child feels in extreme danger and may have a slow heart rate and breathing rate.
There’s a lot of useful stuff if you have a child with a problem – since I was reading it just for information it was a bit tedious to get through. I think the main takeaway is that kids (especially young ones) just don’t have much control over their emotions and actions, and you have to help them by making them feel safe instead of expecting them to be able to do something they just can’t. Which is valuable!
I got clued in to Ted Chiang’s work by watching the movie Arrival, which is an amazing(*) movie based on the titular short story in this collection. And these are quite good sci-fi stories of just the kind I like – what if some strange premise were true, how would it change the world? How would we react and adapt to it?
Quick rundown of some “awards”:
– Best world-building: Tower of Babylon
– Weirdest: Seventy-Two Letters (part of the premise was neat and the other part was, umm, pretty weird!)
– Most thought-provoking: Liking What You See: A Documentary
(*) – full disclosure, I watched it a few months after the birth of my daughter so I was pretty sleep-deprived, but I’m pretty sure it’s still very good!
OK! So I finally read(*) the Mueller report. I’ve seen some snarky takes before about how you can’t trust Americans to read anything. Folks, it’s 470 pages long, and while it’s not full of legalese, it’s still not the easiest read. There are definitely some interesting parts but it was kind of a slog.
But, you don’t have to make my mistakes! If you want to read the most important stuff without slogging through the whole thing, here’s what I’d recommend:
– Skip Volume 1 – Volume 1 is all about Russia’s interference in the presidential election and its interaction with the Trump campaign. There’s nothing terribly conclusive here, although if you’re interested you can read the executive summary (10 pages) which includes a summary about what Russia did.
– Read Volume 2’s executive summary (10 pages) – Volume 2 is about obstruction of justice, and the executive summary briefly describes the 11(!) events that might be obstruction of justice. I don’t think there’s anything actually new here, but reading them all back to back had quite an effect on me. (also, kudos to the New York Times and Washington Post – basically everything they wrote about this stuff is confirmed here!)
– Read Volume 2 Section 1A (5 pages) – this describes what is necessary for something to be obstruction of justice. Fun fact: it is not nearly as strict as I had thought! If you, say, tell your lawyer to fire the special counsel that’s investigating you, and he doesn’t do it, that can still be obstruction. Even if your conduct would otherwise be lawful, if your motive is improper that can still be obstruction.
Reading those two things back to back made me pretty darn convinced that the President committed obstruction of justice. You’re welcome to read more details about the stuff in Volume 2, or the long parts about the constitutional issues involved, but in retrospect it really wasn’t necessary.
Anyway, I’d really recommend you read at least those parts – I picked up a copy on my Nook for 99 cents, I’m sure there are ones for the Kindle, and there’s a free PDF if you’d rather read it on a bigger screen.
(*) – Fine, I actually skimmed the last 10% or so…
This book was a gift, and when I saw the author was a Fox News host I was quite confused. But he grew up in the UK, so he’s a European-style conservative.
And to his credit, there are some pretty good ideas in here! Some of my favorites are: more local (i.e. neighborhood) control over things that make sense, a national service program, more help/resources for parents, serious antitrust enforcement, getting rid of noncompete clauses, and a living wage(!).
There are also some less-good ideas: in particular he’s proudly a nationalist (not a white nationalist, mind you), and he was strongly pro-Brexit when he worked in the UK. And he’s enthusiastic about “green/brown” zoning, which means that land is either zoned for nature (like a park) or for any sort of development. (I don’t really know how to feel about that. Maybe it’s good?)
The book has almost a verbal tic about “elites” – it goes on and on about how elites are trying to keep you, the people, down, and it’s a bit excessive and maybe even a little dangerous.
One amusing running thread is that in the introduction he talks about how liberals and conservatives are both wrong, and only by looking for pragmatic solutions can we find things that will benefit the people. This is the kind of thing that sounds great but breaks down pretty quickly. For example, in the section about health care he says that:
– Democrats want universal coverage
– Republicans want competition and consumer choice
– There have been some scandals in England’s NHS (National Health Service); it’s not so great!
– But actually people in England are very proud of the NHS despite its problems
– And obviously people shouldn’t go bankrupt because of medical bills
– So clearly the pragmatic middle ground is the government paying for healthcare, but the healthcare itself being provided by private doctors.
And I was like…umm, great, but this is clearly a leftist/Democratic position! Have you met the modern Republican party? You should watch some of the other shows on your network!
Anyway, there are a lot of interesting ideas here and it was a fairly easy read. Would recommend, even though I don’t agree with all of his policy ideas.
As I’ve mentioned before, I have a bit of a fascination with airplane crashes, and several books I’ve read mentioned this one as a seminal work in describing how accidents in complex systems happen.
The main part of the book sets up a framework for categorizing systems. One dimension is “loosely-coupled” versus “tightly-coupled” – this roughly corresponds to how much slack there is in the system. A good example of a tightly-coupled system is an assembly line where parts move down a conveyor belt – if something goes wrong and messes up a widget at one station, that widget will quickly be at the next station, where it can cause other problems.
The other dimension is “linear” versus “complex”, which roughly describes the interactions between parts of the system. An assembly line with a conveyor belt is a good example of a “linear” system because the interactions between the different stations are pretty predictable. Usually the more compact in space a system is, the more “complex” it is because lots of different parts of it are close together.
Tightly-coupled complex systems are prone to what the author calls “normal” accidents which aren’t really preventable. Basically, when a system is tightly-coupled you need to have a pretty strict plan for how to deal with things when something goes wrong, because you don’t have a lot of time for analysis or debate. (a military-like structure can help, although obviously this can have bad consequences for organizations that are not the military) But complex systems require more deliberation to figure out what’s actually going on and possibly more ingenuity to find a solution.
It’s interesting because in retrospect for each particular accident it’s usually easy to see what went wrong and what the people involved did wrong. (or what the organization did wrong before that point) The author’s point is that most of the time blaming the people involved is missing the point – these sorts of accidents are inevitable.
Most of the book is looking at specific systems (nuclear power plants, chemical plants, airplanes, marine shipping, dams, spacecraft, etc.), trying to categorize them, and looking at examples of accidents.
(I should point out that I’m grossly oversimplifying here…)
I think I mostly agree with his points, but I really don’t have the depth of experience to know how reasonable his approach is. The book was written just before Chernobyl (so the part about nuclear power plants seems prescient), but there’s also an afterword written in the late 90s about the Y2K problem and how maybe everything will be fine but there will likely be unpredictable serious problems, which didn’t pan out. So I dunno.
The book itself is pretty academic and was kind of a slog to get through even though I am interested in the topic.
This was a really interesting book about the Chernobyl disaster. I was hoping for details about why it happened, and while the book does go into that, the explosion doesn’t happen until around page 100 (of 360 or so). But what happened afterwards was really interesting, too!
The reactor design at Chernobyl was a major cause of the accident. Nuclear reactors in the West use water as both a coolant and a moderator – if the reactor gets too hot, more of the water turns to steam, which is a less effective moderator, so the chain reaction slows down and can even stop. This is known as a “negative void coefficient”, and it means that if things get out of control the reactor will shut itself down thanks to physics. But the Chernobyl reactor had a “positive void coefficient”, meaning that if the reactor gets too hot the chain reaction speeds up, and if the operators of the plant don’t do something to stop it a meltdown or explosion will occur.
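That void coefficient explanation is really just a feedback loop with a sign flip, which can be sketched as a toy simulation. (This is purely my own illustration with made-up numbers – it’s not from the book, and real reactor physics is vastly more complicated.)

```python
# Toy illustration: how the sign of the void coefficient changes
# reactor behavior after a disturbance. All numbers are invented.

def simulate(void_coefficient, steps=50):
    """Simulate normalized reactor power with a simple linear feedback loop.

    Each step, excess power creates steam voids; the void coefficient's
    sign determines whether those voids add or remove reactivity.
    """
    power = 1.0   # normalized power, 1.0 = steady state
    power += 0.2  # a sudden disturbance heats the core
    for _ in range(steps):
        voids = power - 1.0               # more power -> more steam voids
        reactivity = void_coefficient * voids
        power += reactivity * 0.1         # reactivity feeds back into power
    return power

western = simulate(void_coefficient=-1.0)    # negative: self-stabilizing
chernobyl = simulate(void_coefficient=+1.0)  # positive: runaway

print(f"negative coefficient -> power settles near {western:.3f}")
print(f"positive coefficient -> power grows to {chernobyl:.3f}")
```

With the negative coefficient the initial 0.2 bump decays back toward steady state; with the positive coefficient the same bump compounds every step, which is the “physics works against you” problem in a nutshell.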
The Russian physicists didn’t discover this until late during the construction of their first nuclear reactor of this kind. They tried to tweak the design but couldn’t fix this problem easily.
Another problem was that the emergency shutdown protocol was to drop the control rods into the core to absorb neutrons and stop the chain reaction. But the designers thought the impact of instantly shutting off all power would be too harsh on the Soviet power grid, so by design it took around twenty seconds for the control rods to complete their descent into the core.
In fact, this reactor type had already had two major accidents (one partial meltdown and one explosion), so the Soviet agency in charge was supposed to develop new safety regulations and a better emergency shutdown protocol. But none of that was ever done, probably because of the political pressure to get more reactors up and running. And in fact every nuclear accident was treated as a state secret (presumably because it was politically embarrassing), so even operators of other reactors weren’t told what had happened!
Shoddy construction (due to unrealistic deadlines) was yet another problem.
I was curious about how much of the accident’s cause was due to the Soviet political system, and it turns out a lot of it was. Despite the things I mentioned above, the designers did write a manual that included detailed instructions that probably would have been good enough to avoid problems, if operators followed them perfectly and exactly (which is itself unrealistic). But operators were used to bending the rules to get things done to meet their production targets already. And because of the whole secrecy aspect the manual didn’t emphasize which rules were actually necessary for safety.
Another takeaway from the book is that the plant operators didn’t even realize there had been a (giant!) explosion for a while. Some of this, it seems, was due to the usual “fog of war” around major accidents (an explosion in the reactor was basically unthinkable), but there seems to have been some incompetence as well. The book goes into detail about people in the town going about their day (the explosion happened around 1 AM on Saturday morning) and I spent around 100 pages yelling “Get out of there!”.
Other odds and ends:
– The Soviet agency behind atomic weapons and nuclear reactors was named the “Ministry of Medium Machine Building” in order to conceal its true work!
– Fallout (radioactive dust) is very, very hard to clean up, because it’s, well, dust. Spraying down streets, etc. with water helps, but every time the wind picked up it would just move the dust around more.
– Gorbachev (the general secretary of the Communist Party at the time) wanted to come clean to the world about what had happened, in line with his glasnost policy, but was overruled by the Party elders. In fact, people (Gorbachev included!) see the Chernobyl accident and ensuing coverup as a big factor in what brought the Soviet Union down.
– Because the radioactivity had spread to Europe relatively quickly, the West knew that something was up but not what had happened or how bad it was. A United Press International reporter in Moscow at the time talked to a Russian woman who said that two thousand people had died as a result of the explosion, and this made headlines everywhere. This was not in fact true, and a different reporter thought that the UPI reporter’s Russian was so bad he mistranslated what the woman had said!
– One of the cleanup efforts involved helicopters dropping sand or clay on the now-burning reactor to try to put out the fire. (this later turned out to be a Very Bad Idea(TM)) The helicopter pilots tried to line their helicopters and seats with lead to protect themselves, and even had a catchy rhyme about it: “If you want to be a dad, cover your balls in lead”! (I’m assuming that rhymes in Russian or something)
– The “China Syndrome” (popularized by the 1979 movie), the idea that a reactor could melt down through the floor and the ground all the way to China, is not true. But the Chernobyl reactor could have melted down through the floor and gotten into the nearby river, which would have poisoned the water that thirty million people used(!) Thankfully, this did not happen.
– Another big worry was that all the material left in the reactor could have started a new nuclear chain reaction, emitting much more radiation. This also did not happen.
– There was a bunch of radioactive debris on the roof of the building (there were three more reactors as part of the same complex!), and technicians tried to deploy robots designed to work with radioactive material to clean it up. But the environment was too hostile for the robots – the intense gamma radiation killed their electronics. So they used “bio-robots”: soldiers would run out onto the roof (with some protective gear), throw some debris through the hole in the roof, and run off. This would hit each of them with the maximum allowed radiation dosage, so they’d get a handshake and get to go home, and the next soldier would go in. It took over 3800 soldiers, but they got the job done.
– 17.5 million people lived in the most seriously contaminated areas of Ukraine, and 696,000 had been examined by doctors, but the official death toll was 31. This was, of course, a gross undercount of the deaths the radiation caused.
– In the end, 1800 square miles of Ukraine and Belarus were declared “officially uninhabitable”.