blogjou
It seems like at least the European corona crisis is coming to an end, so I need another socially accepted excuse for never being around anywhere. A blog!

The accent of belatedness...
The accent of belatedness is caught and held perfectly, in what we finally see is the saddest kind of love lyric, one that memorializes only a dream of youth.
Self-improvement is a large enough project...
Self-improvement is a large enough project for your mind and spirit: there are no ethics of reading. The mind should be kept at home until its primal ignorance has been purged; premature excursions into activism have their charm, but are time-consuming, and for reading there will never be enough time.
Nostalgia for lost illusions...
Nostalgia for lost illusions, loves that never quite were, happiness perhaps only tasted - these are the emotions Calvino evokes. In Isidora, one of the Cities of Memory, “the foreigner hesitating between two women always encounters a third,” but alas you can arrive at Isidora only in old age.
More than a comedian of genius...
More than a comedian of genius, [Flannery O’Connor] had also the penetrating insight that religion for her countrymen and -women was not the opiate, but rather the poetry of the people.
Maupassant has learned...
Maupassant has learned from his teacher, Flaubert, that “talent is a prolonged patience” at seeing what others tend not to see.
We are indeed already in the "inferno of the living"...
We are indeed already in the “inferno of the living”. We can accept it, and so cease to be conscious of it. But there is a better way, and it might be called the wisdom of Italo Calvino:
…seek and learn to recognize who and what, in the midst of inferno, are not inferno, then make them endure, give them space.
Calvino’s advice tells us again how to read and why: be vigilant, apprehend and recognize the possibility of the good, help it to endure, give it space in your life.
In major short stories...
In major short stories, reality becomes fantastic and phantasmagoria becomes disconcertingly mundane. That may be why so many readers, these days, shy away from volumes of stories, and purchase novels instead, even when the stories are of much higher quality.
Short stories favor the tacit; they compel the reader to be active, and to discern explanations that the writer avoids. The reader, as I have said before, must slow down, quite deliberately, and start listening with the inner ear. Such listening overhears the characters, as well as hearing them; think of them as your characters, and wonder at what is implied, rather than told about them. Unlike most figures in novels, their foregrounding and backgrounding are largely up to you, utilizing the hints subtly provided by the writer.
Frank O'Connor, who disliked Hemingway...
Frank O’Connor, who disliked Hemingway as much as he liked Chekhov, remarks in The Lonely Voice that Hemingway’s stories “illustrate a technique in search of a subject,” and therefore become a “minor art.”
Calculating the effective permeability of sandstone with multiscale lattice Boltzmann/finite element simulations
When scaling up from the microscale to the macroscale, one is often not interested in a single global value such as a mean, but rather in the spatial variation of a continuum variable. The authors define a representative elementary volume (REV) at the microscale-macroscale boundary. Inside each REV, a lattice Boltzmann (LB) method is used to compute the microscale dynamics. The result per REV is then used on the global scale, where the global dynamics are solved with a finite element (FE) method.
I see a close link to Dynamic upscaling of decomposition kinetics for carbon cycling models. In the referenced paper, Eq. (16) describes the macroscale dynamics, corresponding to FE here. On the macroscale, the parameters $\sigma^2_{C_s}$, $\sigma^2_{C_b}$, $C'_s C'_b$, etc. are used, and these can be obtained from the microscale dynamics, corresponding to LB.
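To make the coupling idea concrete, here is a minimal toy sketch (my own, not the authors' code): a stubbed per-REV "microscale" step, standing in for the pore-scale lattice Boltzmann simulation, feeds element-wise permeabilities into a tiny 1D linear finite element Darcy solve, from which an effective permeability is read off.

```python
# Toy sketch of the REV-based LB/FE coupling, assuming 1D Darcy flow and
# linear finite elements; the per-REV permeability is a random placeholder
# standing in for a pore-scale lattice Boltzmann simulation.
import numpy as np

rng = np.random.default_rng(0)

def rev_permeability(rev_index):
    """Placeholder for the microscale step: in the paper this would be an
    LB simulation on the REV's pore geometry. Here: a hypothetical
    log-normal draw around 1e-12 m^2."""
    return 1e-12 * rng.lognormal(mean=0.0, sigma=0.5)

# One REV per coarse element on the unit interval, pressure drop 1 -> 0.
n_elem = 20
k = np.array([rev_permeability(e) for e in range(n_elem)])
h = 1.0 / n_elem

# Assemble the 1D linear-FE stiffness matrix for -(k p')' = 0.
n_node = n_elem + 1
A = np.zeros((n_node, n_node))
b = np.zeros(n_node)
for e in range(n_elem):
    A[e:e+2, e:e+2] += (k[e] / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])

# Dirichlet boundary conditions: p(0) = 1, p(1) = 0.
A[0, :] = 0.0;  A[0, 0] = 1.0;   b[0] = 1.0
A[-1, :] = 0.0; A[-1, -1] = 1.0; b[-1] = 0.0
p = np.linalg.solve(A, b)

# Effective permeability from the macroscale flux; for elements in series
# this should reproduce the harmonic mean of the element permeabilities.
flux = -k[0] * (p[1] - p[0]) / h
k_eff = flux * 1.0 / (1.0 - 0.0)
print(f"k_eff = {k_eff:.3e}, harmonic mean = {n_elem / np.sum(1.0 / k):.3e}")
```

In the real 3D setting each REV gets its own LB computation on the imaged pore geometry, and the resulting permeabilities enter the FE problem as spatially varying coefficients; the 1D toy only shows the direction of information flow.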
Quantifying entropy using recurrence matrix microstates
The authors introduce a complexity measure for nonlinear time series data that is based on the recurrence plot (RP) and the Shannon information entropy of its microstates. This complexity measure is easy and efficient to compute and approximates the maximum Lyapunov exponent of the data. It can also be used to discriminate between deterministic, chaotic, and stochastic data.
For $x_i\in\R^d$, $i=1,\ldots,K$, the RP is a matrix $\tens{R}=(R_{ij})_{ij}$ given by
\begin{equation} \nonumber R_{ij} = \begin{cases} 1,\, & |x_i-x_j| \lt \varepsilon,\newline 0,\, & |x_i-x_j| \geq \varepsilon, \end{cases} \end{equation}
where $\varepsilon>0$ is called the vicinity threshold. Diagonal structures of $1$s parallel to the main diagonal display recurrence patterns and are signs of determinism. The idea is now to fix a small natural number $N$, typically $N\in\{1,2,3,4\}$, and to look at $(N\times N)$-submatrices of $\tens{R}$. A fixed number $\bar{N}$ of such submatrices is selected randomly. The total number of possible microstates is $N^\ast=2^{(N^2)}$, and with $P_i=n_i/\bar{N}$, where $n_i$ is the number of occurrences of the specific microstate $i$, we get the entropy
\begin{equation} \nonumber S(N^\ast) = -\suml_{i=1}^{N^\ast} P_i\,\log P_i. \end{equation}
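As a quick illustration of these definitions, here is a small numpy sketch (my own; the function names are hypothetical) that builds the recurrence matrix and estimates the microstate entropy from randomly sampled $(N\times N)$-submatrices.

```python
# Minimal sketch of the recurrence-matrix microstate entropy, assuming a
# scalar time series (d = 1) and natural logarithms.
import numpy as np

def recurrence_matrix(x, eps):
    """R_ij = 1 if |x_i - x_j| < eps, else 0."""
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(np.uint8)

def microstate_entropy(R, N=3, n_samples=10_000, rng=None):
    """Shannon entropy of n_samples randomly drawn N x N submatrices of R."""
    rng = np.random.default_rng() if rng is None else rng
    K = R.shape[0]
    weights = 2 ** np.arange(N * N)   # encode each block as an integer microstate id
    counts = {}
    for _ in range(n_samples):
        i, j = rng.integers(0, K - N, size=2)
        code = int(R[i:i+N, j:j+N].ravel() @ weights)
        counts[code] = counts.get(code, 0) + 1
    P = np.array(list(counts.values())) / n_samples
    return -np.sum(P * np.log(P))
```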
Although $N^\ast$ grows quickly as a function of $N$, usually only a small number of microstates are effectively populated. So the effective set of microstates needed to compute the entropy adequately can be populated by just $\bar{N}$ random samples drawn from the recurrence matrix, and fast convergence is expected. The authors find that usually $\bar{N} \ll N^\ast$ for $N > 3$, such that $\bar{N} \sim 10,000$ is enough. This makes the method extremely fast even for moderate microstate sizes $N$, and it also suggests that a microstate size of $N = 4$ is sufficient for many dynamical and stochastic systems.
The maximum entropy occurs when all microstates are equally likely, i.e. $P_i=1/N^\ast$, and is given by
\begin{equation} \nonumber S_{\max} = \log N^\ast = \log 2^{(N^2)} = N^2\,\log 2 . \end{equation}
The closer $S(N^\ast)$ is to $S_{\max}$, the more stochastic the data are.
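Reusing `recurrence_matrix` and `microstate_entropy` from the sketch above, a purely illustrative comparison (my parameter choices) of a chaotic logistic map against white noise shows how the estimated entropy relates to $S_{\max}=N^2\,\log 2$: the noise should come close to the maximum, while the deterministic series should stay well below it.

```python
import numpy as np

N, K = 3, 2000
rng = np.random.default_rng(1)

# Logistic map in the chaotic regime vs. uniform white noise.
x_det = np.empty(K); x_det[0] = 0.4
for t in range(K - 1):
    x_det[t + 1] = 4.0 * x_det[t] * (1.0 - x_det[t])
x_sto = rng.uniform(size=K)

for name, x in [("logistic map", x_det), ("white noise", x_sto)]:
    eps = 0.1 * (x.max() - x.min())   # vicinity threshold: 10% of the data range
    S = microstate_entropy(recurrence_matrix(x, eps), N=N, rng=rng)
    print(f"{name:12s}  S = {S:.2f}   S_max = {N**2 * np.log(2):.2f}")
```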