Home | Journal | Bookshelf | Index | Other | Summaries | Timeline | Help | Copyright
Previous | Next | Cards | Index | Cloud

Ross indexed the following pages under the keyword: "Entropy".

Adaptation essential nature of
Entropy in physiology
Levels and alcohol
Stimulus essential property of
Dominance of receptors
Neuron gradient in
Receptors and gradients
Synapse phylogeny of
Waves in nerve impulse
0151 0152

Entropy ? validity of
Intelligence basis of
Nirvanophilia basis of
Inhibition in retina
Integration (physiological) extract from Creed
Receptors integration in
Retina essential light of
Sensation integration in
0313 0314
Entropy ? validity of
Organisation and entropy
Reversible process and entropy
Time and entropy
0345 0346
Entropy ? validity of
Organisation Fulton on
0351 0352
Entropy ? validity of
Neutral point (of equilibrium - including 'cycle', 'region' etc.) moving
Organisation and entropy
Reversible process and entropy
Energy free energy
0359 0360

Cortex spread of organisation
Entropy ? validity of
Organisation spread of
Equilibrium of a neuron
Equilibrium physical characteristics
Neuron as a stable unit
Neuron environment of
0589 0590

Entropy and learning
Equilibrium and entropy paradox
Learning and entropy
Reversible process and learning
Organisation spread of
0813 0814

Entropy and organisation
Organisation and entropy
Organisation two meanings
0879 0880
Entropy and sensory adaptation
Differential equation turned to operator
Group (mathematical) and machine
Isomorphism in machine
Operator = differential equation
Substitution (mathematical) as differential equation
0917 0918

Entropy Schrödinger on
Natural Selection [41]: A gene has two survival-values, its chance of being destroyed by a competitor, and its chance of being destroyed by thermal agitation, 1913.
1913 1914

Entropy calculation of
Information calculation of
Markov process / chain information from
Transducer theory of
3368 3369

Summary: Judge the difficulty of "Design for a Brain" by the comprehension shown by the average worker in EEG. Adjust the difficulty so that almost all such workers show comprehension.
Energy notes on
Entropy in physics
4090 4091
Summary: Entropy etc.
Entropy dimensions of
Set or Ensemble in experimenter
4094 4095
Entropy limit of
Transformation Monograph
4346 4347

Summary: Information that can come from a machine.
Graph definition
Machine falls in capacity
Epistemology [22]: Estimates of how fast knowledge can be got from a machine 4430.
Entropy and information
4430 4431
Entropy computing
Information computing
Summary: Computing entropy.
4444 4445
Summary: Formal proof that, if a set is stable, no state in it can lead ever to a state outside the set.
Summary: Power for adaptation. (See next page) 4583
Adaptation motive power for
Entropy as prime mover
4536 4537
Homeostasis variety is necessary
Regulation variety is necessary
Requisite Variety, Law of stated
Summary: Only variation can force variation down - exact conditions necessary. 4662, 4674, 4750
Entropy necessary for regulation
Output variety in
Variety necessary for regulation
4658 4659
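The cards above state the Law of Requisite Variety ("only variation can force variation down"). As a hedged illustration, not taken from the cards: in Ashby's counting form, a regulator with R distinct moves facing D distinct disturbances cannot reduce the variety of outcomes below ceil(D / R), so regulation demands variety in the regulator at least matching the variety of the disturbances.

```python
import math

def minimal_outcome_variety(d: int, r: int) -> int:
    """Least achievable number of distinct outcomes when a regulator
    with r moves faces d disturbances (counting form of the Law of
    Requisite Variety): outcomes >= ceil(d / r)."""
    return math.ceil(d / r)

# With 8 disturbances and only 2 regulator moves, at best 4 distinct
# outcomes remain; matching variety (8 moves) can pin the outcome down to 1.
print(minimal_outcome_variety(8, 2))  # 4
print(minimal_outcome_variety(8, 8))  # 1
```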
Summary: Entropy and information.
Entropy as missing information
4666 4667

Summary: Statistical laws in the Markov chain. 4919, 4946
Basin and 'closure'
Closure and 'basin'
Experience example
Aging (as process) and entropy
Entropy in physics
4918 4919
Summary: Effects of channel capacity in joining two systems, only one of which is observed.
Epistemology [49]: As the coupling between B and A is made richer, either by increasing channel capacity or by adding immediate effects, so will what is in B affect x the sooner. 5026. DIAGRAM
Entropy during search
Hunt and stick information flow
Information during hunt and stick
Searching information flow
Selection information flow
Trial and error gives information
5026 5027

Entropy and organisation, in cream
Organisation and entropy
Summary: The problem of the glass of milk.
5110 5111
Entropy zero entropy
Homeostasis as noise-correction
Noise homeostasis as anti-noise
Requisite Variety, Law of as noise-correction
Summary: Law of Requisite Variety as law for suppression of noise.
Feedback and persistence
Quantum theory and Black Box
5242 5243

Summary: No regulator (other things being equal) can give performance better than the machine with input.
Entropy zero entropy
Markov process / chain of zero entropy
6160 6161

Summary: The "mesa" phenomenon as decay of information. But see 6784
Entropy second law and heterogeneity
Thermodynamics second law
6644 6645
Entropy fall in state-determined-system
Constraint increase with time
Laws of nature increase with time
Pattern (in general) increase with time
Planet growth of law on
Steady state on planet, and law
Time makes law increase
6772 6773
Summary: When a mapping operates (a state-determined system) either: activity goes down or internal patterning increases. 6850, 6835. Examples: 6854
Summary: Under any (one) sequence of mappings either activity decreases or internal patterning increases. 6777
Convergence (of lines of behaviour) random
Entropy rate of fall
Mapping random, convergence
6774 6775
Summary: The Poisson distribution has entropy (just under) 1 bit less than the even distribution. 6780, 6814
Summary: Entropy of ½ ± y.
Entropy H(½+y,½-y)
6776 6777
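The card indexed above concerns the entropy of the two-valued distribution (½+y, ½-y). As an illustrative sketch (the function name and code are not from the cards, only the standard Shannon formula): the entropy is a full bit at y = 0 and falls to zero as y approaches ½.

```python
import math

def binary_entropy(y: float) -> float:
    """Entropy, in bits, of the distribution (1/2 + y, 1/2 - y)."""
    h = 0.0
    for p in (0.5 + y, 0.5 - y):
        if p > 0:  # 0 * log(0) is taken as 0
            h -= p * math.log2(p)
    return h

print(binary_entropy(0.0))  # 1.0 (even distribution)
print(binary_entropy(0.5))  # 0.0 (all probability on one value)
```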
Summary: Under a random mapping, the internal transmission will rise rapidly at first, as the distribution leaves the rectangular form; but it will soon stop increasing, and can be expected to show no further interesting degree of increase. 6802
Entropy Bent's book
Second Law (of thermodynamics) Bent's book
6784 6785
Summary: Another example of the relativity of memory.
Entropy non-exclusive events
6794 6795
Summary: How the deletion of one arrow from a full DIE (diagram of immediate effects) makes the number of possible mappings fall. 6884
Summary: Deletion of the DIE-arrows does not imply that the convergence must increase necessarily.
Entropy sampling distribution of
6806 6807

Summary: (See previous note)
Summary: On consciousness.
Conscious mind proves physics insufficient
Entropy Conant's HL
Materialism inadequate
6924 6925

Summary: Morowitz' magnificent book. 7053
Entropy expected value
Constraint permutation as
Permutation constraint necessary
Transmission for permutation
6982 6983
Summary: Communication required to keep a shoal of fish together. 7068, 7067, 7084
Entropy of Gaussian distribution
Summary: Entropy of a Gaussian distribution. 7069
7066 7067
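The card indexed above (7069) concerns the entropy of a Gaussian distribution. As a hedged sketch, using the standard differential-entropy result rather than anything taken from the card itself: the entropy works out to ½·log₂(2πeσ²) bits, about 2.05 bits for a unit-variance Gaussian.

```python
import math

def gaussian_entropy_bits(sigma: float) -> float:
    """Differential entropy, in bits, of a Gaussian with
    standard deviation sigma: (1/2) * log2(2 * pi * e * sigma^2)."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

# Doubling sigma adds exactly one bit of entropy.
print(round(gaussian_entropy_bits(1.0), 2))  # 2.05
print(round(gaussian_entropy_bits(2.0), 2))  # 3.05
```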
