Principle
If the universe as a whole can be considered a computational fabric on which matter is organized so that computations can be done efficiently (cf Beliefs#B1), then the choice of which computations to conduct is of fundamental importance.
Those political decisions can be partly witnessed in the history of large computations at the scale of nations (hence the dedicated section on this page), but also through individual decisions leveraged by the private sector, typically two-sided markets as explained in InformationRules (still to develop).
Overall, the allocation of resources is strategic and has fundamental social implications. Despite the famous adage that "knowledge is power", computation has so far rarely been considered a resource per se, yet it might be one of the most important ones, around which the others revolve.
This should be used as part of ways to become politically more efficient, e.g. with VotingIsALearningProcess.
Content
- patterns
- mostly only information about the technical specifications of the supercomputers, but hardly anything about the computations they ran
- pushing structured matter around (final product)
- pushing information around, to build the final product locally
- pre-computations, distributed computation, ...
- handling increasingly large datasets (thus endless race)
- e.g. manipulating no genome, then 1 genome with difficulty, then currently working with hundreds or whole taxa
- economy of scale, specialization/standardization/distribution
- at the political level, who decides how supercomputers will be allocated?
- how does the hierarchy influence such choices?
- how does the underlying ideology influence such choices?
- publicly funded classics
- weather
- high-energy physics with no foreseeable means of exploitation
- cosmological simulations
- defense
- note that Agnotology could help distinguish a pattern from the history of declassification
- private sector classics
- energy e.g. drilling simulation for B.P., ...
- finance e.g. JP Morgan
- mass end-user information e.g. Google, Facebook, ...
- large computations
- what algorithm is currently consuming the most CPU cycles? the most energy? (see also Seedea:Content/Newconcepts#FERD )
- also an interesting measure to compare distributed computations, e.g. comparing BitcoinFabelier with the Top500 or http://wiki.apache.org/hadoop/PoweredBy
- which algorithms a society has historically invested the most energy in
- and related hypotheses, e.g.
- is there a constant share dedicated to cryptanalysis?
- is there a book on the history of large computations (ideally including cryptography and cryptanalysis)? Not the history of calculus, computation or automation, but rather of using those tools and methods over a significant period of time, thus consuming a large share of the available resources that would otherwise have been used for other important tasks (hence a political question)
- has humanity overall always spent an equivalent or increasing share of its resources on computations?
- consider using Seedea:Content/Newconcepts#FERD, as a ratio of the total available FERD per period
- recall Wikipedia:Lewis Fry Richardson#Weather_forecasting, in which the "computers" filling a room were people
- see StructuralInformationAsymmetries#ArchitectureIsomorphisms on patterns in space rather than in time, or the impact of Cognition#HistoryOfDominantCognitiveModels on which problems are actually being solved
- see MecaMind
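One way to make comparisons like BitcoinFabelier vs the Top500 (mentioned above) concrete is to normalize heterogeneous computations to operations per joule. A minimal sketch; every figure below is a placeholder for illustration, not a measurement:

```python
# Sketch: normalize heterogeneous computations to operations per joule,
# so that e.g. a distributed network and a supercomputer become comparable.
# All figures below are PLACEHOLDERS, not real measurements.

def ops_per_joule(ops_per_second: float, watts: float) -> float:
    """Efficiency: operations per joule = throughput divided by power draw."""
    return ops_per_second / watts

platforms = {
    "hypothetical distributed network": {"ops_per_second": 1e15, "watts": 5e6},
    "hypothetical supercomputer": {"ops_per_second": 8e15, "watts": 1e7},
}

for name, p in platforms.items():
    print(f"{name}: {ops_per_joule(p['ops_per_second'], p['watts']):.2e} ops/J")
```

Note that defining "one operation" consistently across platforms (hashes vs floating-point operations) is itself a contested modeling choice, which such a normalization makes explicit.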
- consider a DBPedia request using the Purpose of entries in Categories: Supercomputers (added to SemanticWeb#Requests)
- note that those lists are mainly "just" public research computations; private ones (e.g. currently Facebook, Google, etc.) and distributed individual efforts (e.g. Folding@Home, BitCoin, etc.) are not included, and even though individually they may not be relevant, their sum might be significant
- restricted to science?
- different programs running, with the size of each share
- In graphics: Supercomputing superpowers, BBC News May 2010
- including treemap by applications
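The DBPedia request above could be sketched as a SPARQL query listing the machines in the Supercomputers category with their abstracts. The category name and the idea of extracting a "purpose" from the abstract are assumptions; DBPedia has no dedicated purpose property:

```python
# Sketch of a DBPedia SPARQL query for the Supercomputers category.
# Assumption: the relevant category is dbc:Supercomputers and the closest
# thing to a machine's "purpose" is its English abstract (dbo:abstract).

def build_query(category: str = "Supercomputers", limit: int = 50) -> str:
    """Return a SPARQL query listing category members and their abstracts."""
    return f"""
    PREFIX dct: <http://purl.org/dc/terms/>
    PREFIX dbc: <http://dbpedia.org/resource/Category:>
    PREFIX dbo: <http://dbpedia.org/ontology/>
    SELECT ?machine ?abstract WHERE {{
        ?machine dct:subject dbc:{category} .
        OPTIONAL {{ ?machine dbo:abstract ?abstract .
                    FILTER (lang(?abstract) = "en") }}
    }} LIMIT {limit}
    """

print(build_query())
```

Such a query can be submitted to the public endpoint at http://dbpedia.org/sparql, e.g. requesting JSON results.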
- history of the largest computations
- consider the link with the evolution of Mathematics#ComplexityTheory and what is calculable, what is not
- military battle
- what was computed beforehand? what outcome did it result in?
- map, estimation of enemy resource, ...
- note that the energy required to run might not be the most important factor, cf WithoutNotesSeptember11#WatsonEnergyConsumptionEstimate, which seems to show that it is actually potentially negligible!
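The energy estimate mentioned just above amounts to multiplying average power draw by runtime, then comparing the result to everyday energy uses. A minimal sketch; both inputs below are illustrative placeholders, not figures from this page:

```python
# Sketch: the energy cost of running a large computation is average power
# draw times duration; comparing it to household consumption shows why it
# can turn out to be negligible. All inputs are illustrative placeholders.

def energy_kwh(power_kw: float, hours: float) -> float:
    """Energy consumed (kWh) = average power draw (kW) x runtime (h)."""
    return power_kw * hours

run = energy_kwh(80, 3)   # hypothetical 80 kW cluster for 3 hours: 240 kWh
household_daily = 30      # rough order of magnitude, kWh per household per day
print(f"{run} kWh ~= {run / household_daily:.0f} household-days of electricity")
```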
Financing such a study
- how many "rich", history-inclined geeks would embrace the Financial#crowdfunding principle to the point of donating?
- potential bonus for helpers
- shipped paper copy from Lulu.com
- shipped poster of a summing-up infographic
- dedicated visit of a significant place (e.g. Computer History Museum)
Inspired by
See also
- StructuralInformationAsymmetries#ArchitectureIsomorphisms on the resulting similarities between seemingly different structures
- history of computing devices
- Seedea:Research/Drive in particular for military simulations
- Dag Spicer, of the research department of the Computer History Museum, answered my request of September 2nd, 2011
- Textbooks on HPC (High Performance Computing); F90 (Fortran 90) applications
- journals related to supercomputing applications, e.g. http://ftp.math.utah.edu/pub/tex/bib/toc/ijsa.html
- the most interesting ones are possibly proprietary (i.e. secret): certainly government work, but also commercial code (such as what is written "in-house", e.g. at Boeing or Ford)
- instances
- sources to explore
- for each government there is a committee dedicated to allocating who, e.g. at UCL or King's College, can use the latest supercomputer paid for by the government; to do so, I'm sure they keep track records of what has been done and what is most likely to make serious progress. Somebody who has been part of such a committee and knows how it worked could, beyond the financial accounting, surely give some interesting pointers
- for each supercomputer: who bought it, what was the justification, what algorithms were developed just for it... really tedious and imprecise
- for each supercomputer brand there are salespersons who would have similar information prepared as marketing arguments
- cf History of Computation Course Notes by Paul E. Dunne via PersonalInformationStream#Scanning
- discussion with Marc after ParadigmShiftMeetings2011
- compare comparable events
- e.g. army-related computations per century
- sources
- military, especially interesting because of the adversarial aspect, thus requiring the smallest possible time frame
- history at the university of Montpellier
- Fort de Vincennes
- finance
- Venice as the first financial hub
- high frequency trading today
- administration, with less important time constraints yet very large quantities
- Colbert
- douanes/customs for international trade
- http://www.quora.com/Could-LAPACK-be-the-library-which-used-the-most-CPU-cycles-total
- Allocations Past Awards at SDSC from 1999 to 2005
- Active XSEDE Allocations
- consider asking on http://www.reddit.com/r/AskHistory/ and http://www.reddit.com/r/AskHistorians/