The Tartar Steppe (Buzzati), book review

Is it like “Waiting for Godot”? Is it triste? It’s great, but what is it?

Triste.

In a past life, I have vague memories of learning French, and of learning that there is a word in French that does not quite translate to English. Triste: sadness with an indefinable quality (I think).

I feel this, having finished this book. How I found this book is now lost to me; some other book referenced it, and at one point I thought: “I must read this.” Now, having spent time on and off for 10 years (I first shelved this TO READ in 2012), I have come to the end of the book.

How do I feel, after a decade of pursuing this book to its end?

Having binged my share of Netflix, HBO, and Prime TV shows and movies, I can say that it is not an action flick. It is not a military triumph. So what is it?

I don’t really know.

It feels a little like “Waiting for Godot”: we are sitting, waiting for something, just around the corner, it peeks, and then disappears. Tantalizing.

I’m reading/listening to a different book right now: Mythos, by Stephen Fry (hilarious). Tantalus was punished by the old gods with tremendous thirst, and water, right THERE, just out of reach. Yes, it feels like that.

It feels a little like Ambivalence, with a capital A: someone indecisive, torn between two ideas, two worlds, two things one MUST do, and yet. Just hold, just one more day. Perhaps tomorrow. Like a person staring at their cigarettes, despising them, desiring them, not quite ready to quit smoking.

It feels a little like wanting, desiring, deserving.

How does he do it? Evoking these emotions, without speaking them. The words, the images, beautiful.

The action: a man joins the military. He is sent to the Fort, a desert frontier, facing the North, from where no one expects an attack to come. One hopes for glory, for action. Day after day, year after year, unexpected drama, tragedies, but the expected / not expected attack? Nothing. And when Something Looms, through the telescope, one believes: “it is too much to hope for.”

And the end? Realization, surprise, mostly at oneself.

What.

Writing about it really doesn’t do it justice. Will this review be oblique enough to evoke the mood, the triste, the indirection of the narrative? You must decide.

The Classics: what Copernicus teaches us about dogma vs revolution

I am a collection of contradictions. I alternate between hedonistic Netflix binges and an occasional classic book (Ptolemy, Copernicus, Kepler). What contradictions do you embody, and what has it taught you?

My son brought this book home from his sophomore year at St. John’s College in Santa Fe. You know the school has its bona fides when its faculty have written, edited, and published the definitive modern editions of these classic books.

This was a fun read. I started with Ptolemy and the geocentric worldview he championed around 150 CE, and followed some of the geometry proofs (having studied Euclid too many years ago). I watched him build his ‘epicycles’ (circles upon circles) to explain the prograde and retrograde motions of the 5 visible planets.

Retrograde: here’s a bonus term for those of us not into astrology: it is the apparent reversal of the motion of a planet as it gradually migrates across the sky. From the newspaper, I recall reading statements like ‘Mars is retrograde in Capricorn’ and thinking, ‘what hocus pocus is this?!’ Now I know it means Mars is migrating in the reverse direction and is positioned inside the constellation of Capricorn in the sky. It makes a little more sense descriptively, but I still don’t buy that astrology foretells my future as a result.

Clearly, says Ptolemy, the only rational view of the heavens is that they are concentric spheres. The proof: with his excellent math, he can predict the positions of the stars into the future.

1400 years later, Copernicus thinks deeply and has an insight (how, though, did he have that insight?). I love books, like Lavoisier’s on Chemistry, where the author lays out his stepwise experiments and failures, and gradual reasoning toward the invention of the concepts of oxygen and hydrogen…

Nevertheless, Copernicus lays out the groundbreaking and heretical thought that the Earth is not the center, and that the sun is the ‘center of the world.’ Heliocentrism. He has to tread gently, as he is shattering what has been accepted by scholars (and the Church) for over one thousand years. (What do WE accept that has been true for 1000 years?)

In his introduction, Copernicus gingerly states that both he and Ptolemy have explanatory systems that predict planetary motion successfully. But the heliocentric model is SO MUCH SIMPLER.

So, how to explain why Mercury and Venus have a much smaller range as they move across the sky night after night, while Mars, Jupiter, and Saturn seem instead to span the entire sky? Ptolemy has to jump through hoops and invent weird epicycles to make sense of this, yet Copernicus sweeps away all that complexity by saying Earth is itself a planet with an orbit BETWEEN Venus and Mars. Thus the ‘inner planets’ behave differently from the ‘outer planets’ because Earth’s own orbit ‘overtakes’ the orbits of those outer planets. And all the epicycles fall away. It is one of the most ‘ohhh’ beautiful moments in the history of science.
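For the programmers among us, the ‘overtaking’ idea can be demonstrated in a few lines. This is a minimal sketch, not real astronomy: it assumes circular, coplanar orbits for Earth and Mars (roughly 1 AU / 1 year and 1.52 AU / 1.88 years), and simply tracks the direction from Earth to Mars night after night. When Earth overtakes Mars near opposition, that apparent direction reverses — retrograde motion, with no epicycles anywhere in the code.

```python
import math

def pos(radius_au, period_years, t):
    # Position on a circular, coplanar orbit at time t (in years).
    theta = 2 * math.pi * t / period_years
    return radius_au * math.cos(theta), radius_au * math.sin(theta)

def apparent_longitude(t):
    # Direction from Earth to Mars: what an earthbound observer sees.
    ex, ey = pos(1.00, 1.00, t)   # Earth: ~1 AU, 1 year
    mx, my = pos(1.52, 1.88, t)   # Mars: ~1.52 AU, ~1.88 years
    return math.atan2(my - ey, mx - ex)

# Watch the sky nightly for two years, starting at an opposition
# (both planets lined up at t = 0), and count the nights on which
# Mars's apparent motion runs backward.
dt = 1 / 365
retrograde_nights = 0
total = 0
for day in range(2 * 365):
    t = day * dt
    d = apparent_longitude(t + dt) - apparent_longitude(t)
    # Unwrap jumps across the -pi/+pi boundary.
    if d > math.pi:
        d -= 2 * math.pi
    elif d < -math.pi:
        d += 2 * math.pi
    total += 1
    if d < 0:
        retrograde_nights += 1

print(f"{retrograde_nights} retrograde nights out of {total}")
```

Most nights Mars drifts forward; for a stretch of weeks around opposition it drifts backward, purely because the faster inner planet (us) is passing the slower outer one.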

Devastatingly, he draws out what Ptolemy’s epicycles require planets to do (spirographic pirouettes through space, ridiculous when you look at it that way).

Cool read, even if this aging person is too lazy to revisit Euclid and plow through the math. The thinking, the paradigm shifting, the worldview challenging. That’s fun. Next up, Kepler and Galileo.

CMIO’s take? The world makes sense until someone comes along to up-end it. The instigator proposing the change often is faced with the sharp end of sticks and pitchforks. As Machiavelli told us, change is so difficult because proponents of any change are, at best, lukewarm, while detractors have ALL THE PASSION IN THE WORLD. Copernicus faced this. I have faced this as well, by developing APSO notes, and advocating for Open Notes and Open Results. Whatever change you’re working on, take heart. We tread this road together.

Is listening to an audiobook really reading? (wired.com)

I take this critique very personally, as about 2/3 of the books I consume are via listening. How about you?

from WIRED.com

https://www.wired.com/story/is-listening-to-audiobooks-really-reading/

Readers: where do you fall on the reading-with-eyes vs listening-to-audiobooks spectrum? I’m not even going to tackle the eReader vs paper book divide.

Can’t book readers just get along?

TL;DR? Do what you like; reading a book can be about enjoyment, or learning something, or developing empathy. If it meets your goals, do it!

Where do you keep the (informatics) pixie dust? (borrowed from NYTimes)

This is hilarious: angsty flowcharts to help guide readers. Must-read article.

Fawzy Taylor, social media and marketing manager of the bookstore: A Room of One’s Own, Madison, WI, via NYtimes “These Memes Make Books More Fun”

Thank you to Fawzy Taylor, whose brainchild this is. Fantastic in so many ways.

Why can’t we build our informatics and our internal education this way? For example, for newbie informaticists, how about my book-recs graphic above, based on the same idea?

CMIO’s take? What do you think of the graphic? of the style? of the content? Guess what? It doesn’t matter, if it gets us talking!

Steven Pinker Thinks Your Sense of Imminent Doom Is Wrong – The New York Times

Steven Pinker image from wired.com

“It is irrational to interpret a number of crises occurring at the same time as signs that we’re doomed.”
— Read on www.nytimes.com/interactive/2021/09/06/magazine/steven-pinker-interview.html

The Xenobot Future Is Coming—Start Planning Now (wired.com)

“…the ability to recode cells, de-extinct species, and create new life forms will come with ethical, philosophical, and political challenges”

https://www.wired.com/story/synthetic-biology-plan/

With CRISPR, the molecular scissors technology, we are gaining not only read, but WRITE access to our genetic data. Writing code will no longer be limited to computers (and electronic health records), but will extend to living organisms. Are we ready? The technology is racing ahead of our ability to think about it and deploy it for the good of all.

Can Learning Machines Unlearn? (wired.com)

https://www.wired.com/story/machines-can-learn-can-they-unlearn/

How much data?

I’ve been thinking about this a lot. In our recent work designing predictive algorithms using linear regressions, neural networks, and similar approaches, we’ve discussed the use of EHR (electronic health record) data, and have had some success using such algorithms to reduce deaths from sepsis (blog post from 10/6/2021).

One of many problems is “how much data?” It has been interesting to work with our data science colleagues on creating a model, and then carefully slimming it down so that our models can run on smaller data sets, more efficiently, more quickly, with less computing power.

Forgetting?

A related problem is “when do we need to forget?” EHR data ages; the way clinicians record findings changes. Our understanding of diseases changes. The diseases themselves change. (Delta variant, anyone?)

Will our models perform worse if we use data that is too old? Will they perform better because we gave them more history? Do our models have an “expiration date?”

The Wired.com article above talks about having to remove data that was perhaps illegally acquired, or perhaps after a lawsuit, MUST be removed from a database that powers an algorithm.
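The simplest form of this, sometimes called exact unlearning, is just: delete the offending records and retrain from scratch. A toy sketch, with entirely made-up data standing in for EHR records (the `fit_line` regression and the patient IDs are hypothetical, for illustration only):

```python
import random

def fit_line(points):
    # Ordinary least squares for y = a + b*x, computed by hand.
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    b = sxy / sxx
    a = my - b * mx
    return a, b

random.seed(0)
# Synthetic stand-in for training data: x is a patient ID,
# y is some outcome that roughly follows y = 2x plus noise.
data = [(x, 2 * x + random.gauss(0, 0.1)) for x in range(1, 21)]

model = fit_line(data)

# A deletion request arrives: patients 5-9 must be removed
# from the training set (say, after a lawsuit).
to_forget = set(range(5, 10))
retained = [(x, y) for (x, y) in data if x not in to_forget]

# "Exact unlearning" here is simply retraining on what remains.
unlearned_model = fit_line(retained)

print(model, unlearned_model)
```

The catch, and the point of the Wired article, is that retraining from scratch is cheap for a 20-point toy but ruinously expensive for a large neural network — hence the research interest in making models forget without a full retrain.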

Humans need to forget. What about algorithms?

Isn’t human memory about selective attention, selective use of memory? Wouldn’t a human’s perfect memory be the enemy of efficient and effective thinking? I’ve read that recalling a memory slightly changes the memory. Why do we work this way? Is that better for us?

Is there a lesson here for what we are building in silico?

CMIO’s take? As we build predictive analytics, working toward a “thinking machine”, consider: what DON’T we know about memory and forgetting? Are we missing something fundamental in how our minds work as we build silicon images of ourselves? What are you doing in this area? Let me know.

A picture of change (and inspiration for informatics. NYTimes)

From the Metropolitan Museum of Art via NYTimes: a Japanese Print to teach us about the modern world

The artistry in our journalism can be remarkable. Spend a few minutes zooming in and out of this Japanese print with Mr. Farago. It is inspiring and completely engrossing.

From an informatics perspective, can we take an EHR screenshot and zoom in and out as entertainingly? Could we =gasp= make learning about EHRs as engaging as an art exhibit?

James Webb telescope: astounding science and engineering (wired)

Zero Kelvin! Lagrange Points! Infrared parabolas! Light years and Time Travel! wtf?! We are living in the future.

https://www.wired.com/story/the-james-webb-space-telescope-is-in-position-now-its-booting-up/

If you have not been following the journey of the James Webb telescope, here is your chance to catch up. TL;DR: it is going well and in a few months we can look forward to astounding images from further away than ever before, and from further back in time than ever before. I can’t wait. Read the nice summary article from Wired.com, above.