Principle

Have all my data integrated in a unifying "framework" (currently my wiki), not necessarily spread out across different innovative cloud services, commercial or not.

PmGraphViz

Color scheme: phasing out, gearing up.

PmGraphViz

Color scheme: http, https.

See also Graphviz for an alternative view organized not by infrastructure first but by privacy.

cloud-init

  • working cloud-init configuration
  • write_files to make files, e.g. .env
  • generate it from a template, then customize and load it
    • e.g. cp ci-template cloud-init; echo " - command" >> cloud-init
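The template-then-customize approach above could be sketched as follows; the template content, the .env path, and the appended command are placeholders, not the actual configuration:

```shell
# Sketch: build a cloud-init user-data file from a reusable template,
# then append host-specific commands. All names here are placeholders.
set -e
cat > ci-template <<'EOF'
#cloud-config
write_files:
  - path: /opt/app/.env
    content: |
      APP_ENV=production
runcmd:
EOF
cp ci-template cloud-init
# host-specific commands appended under runcmd:
echo "  - systemctl restart lighttpd" >> cloud-init
```

Keeping the shared part in ci-template means every host's cloud-init only differs by the lines appended at the end.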

To try

Tried but didn't pursue

Tried but not fully functional

Running

Problems

  1. server mostly, but not fully, reboot-proof and not format-proof
    1. lighttpd OK, screen mostly OK (thanks to Screen stuff)
  2. laptop not format proof
  3. desktop not format proof
  4. backups not automated and in "Russian dolls" format
    1. automatic rotating backups (cf rdiff-backup in Crontab)
  5. phone mostly but not entirely format-proof
    • e.g. notes
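The automatic rotating backups mentioned in 4.1 could look like these crontab entries; the host and path names are placeholders, not the real setup. rdiff-backup stores reverse increments, and --remove-older-than prunes old ones so the archive stays bounded:

```
# nightly incremental backup of /home to the remote server
0 3 * * * rdiff-backup /home backupuser@cloud.example.org::/srv/backups/home
# weekly pruning: keep one month of increments
0 4 * * 0 rdiff-backup --remove-older-than 4W backupuser@cloud.example.org::/srv/backups/home
```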

To use

Solution

  1. upload files
  2. remove them locally
  3. set up tools I might need to feel "just like at /home".
    1. OpenVPN (currently using ssh for tunneling, pscp, sshs)
    2. treemap (demo set up)
    3. rsync (currently using pscp)
  4. scan the uploaded data folders
    1. propose to upload to websites per mime type, file extension
      1. if APIs are complex, provide links to do so manually
      2. set privacy settings on by default
      3. example
        1. jpg, gif, png to Aviary.com
        2. ppt, pptx, pps, odt to Slideshare.net
    2. see Sphinxsearch for my own desktop search equivalent solution
      1. indexes can be shared
  5. locate former links to local fs
    1. search for E:\Work\, file:///, etc
      1. e.g. Cloud:Documents/My%20Artwork/
    2. search for discussion, irc log, etc
      1. e.g. Discussion:seedeabitlbee/paola.log#Date
    3. temp solution via Lighttpd on http://cloud.benetou.fr (consider Cookbook:StringReplace)
    4. search for
  6. allow fast searches
    1. once files are uncompressed build an index
      1. check updatedb / mlocate / find
        1. Cloud:index.txt
      2. update automatically with
        1. Crontab
        2. Wikipedia:inotify with its inotify-tools and incron
    2. provide a private http interface to it
      1. Cloud:find
    3. index meta-data
    4. index content
      1. for pictures and their galleries
        1. 1 line pic gallery (requires phat connection to the server, no re-sized thumbnails)
          1. echo "<html>" > gallery.html && ls *.jpg | sort -n | sed "s/\(.*\)/<a href=\"\1\"><img height=\"200px\" src=\"\1\"\/>\1<\/a><br\/>/" >> gallery.html && echo "</html>" >> gallery.html
          2. example http://fabien.benetou.fr/pub/researchnotes/
          3. use /etc/mime.types rather than the hardcoded "jpg" extension
          4. efficiently use file and directory timestamps with Crontab vs gallery.html
          5. exifinfo, exifautotran, EXIFTool, exiftran, EXIFutils
        2. http://labs.ideeinc.com/ http://www.gazopa.com/ http://www.tineye.com/ http://www.numenta.com/vision/webservices.php
        3. consider OpenCV and dedicated GPU hardware
  7. export http://www.allmyapps.com/my/list/
    1. map configuration files to each application to the remote location of its backups
  8. manage emails
    1. set up mail server
      1. consider http://flurdy.com/docs/postfix/ and http://wiki.mutt.org/?MailConcept/Flow
    2. set up spam filter
      1. consider http://spamassassin.apache.org and http://postgrey.schweikert.ch
    3. set up webmail
      1. http://mail.benetou.fr
    4. set up DNS
    5. email friends with new address
    6. redirect gmail non-spam to new address
      1. seems buggy
    7. download important emails from Gmail through pop/imap
      1. Back up your GoogleMail locally with getmail by Ryan Cartwright, Free Software Magazine, June 2010
      2. offlineimap Read/sync your IMAP mailboxes by jgoerzen
        1. OfflineIMAP with Mutt tutorial on ArchWiki
  9. automate exchanges
    1. Crontab
      1. locally to
        1. download and unpack backups
      2. remotely to
        1. pack and make backups available
  10. check if OurPIM:Papers/PrivacySettings is respected
  11. ensure that the DNS is properly bound to the most fundamental social services
    1. including VoIP
  12. test with Shell#EmbeddingShellClients
  13. consider content delivery per hostname
    1. especially initial configuration or configuration files
      1. e.g. this laptop would have this configuration, that one another, etc.
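The flat-file index from step 6 can be sketched with plain find before reaching for updatedb/mlocate; DATA_DIR and the example file are placeholders for the real uploaded data:

```shell
# Sketch: rebuild a flat file index so searches hit one text file
# instead of re-walking the whole tree. Paths are placeholders.
set -e
DATA_DIR="${DATA_DIR:-$HOME/cloud}"
INDEX="$DATA_DIR/index.txt"
mkdir -p "$DATA_DIR"
touch "$DATA_DIR/example.txt"   # stand-in for real uploaded data
# one path per line, sorted so successive runs diff cleanly
find "$DATA_DIR" -type f ! -name index.txt | sort > "$INDEX"
# then search without touching the tree, e.g.:
grep -i "example" "$INDEX"
```

A Crontab (or inotify/incron) entry rerunning this script keeps the index fresh, and a private HTTP interface can simply grep the resulting index.txt.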

Remarks

  • down to ~5 files, 5 GB; the trick is 0 media content I didn't produce, so no DivX or mp3 collection, which helps a lot.
    • paradoxically this was at first a kind of "discipline" but is now a pleasure, since I use services like Mixcloud and streaming websites with RSS
    • do not just upload my mp3 collection and then listen to it; prefer not to have a collection at all but to link to innovative services that do
      • they are dedicated to that domain
  • my typing input, even if I go as fast as I can, stays rather low, so I should never need fast upload for it
    • worse with asymmetric links, as is the case with ADSL
    • I should move pointers to data around, not the data themselves, except for the ones I produced myself
      • Better Than Owning by Kevin Kelly, The Technium 2009
        • "in the near future, I won't <<own>> any music, or books, or movies. Instead I will have immediate access to all music, all books, all movies using an always-on service"
    • probably the only case in which large upstream bandwidth is required is video editing; pretty much everything else requires large downstream bandwidth but low upstream
  • HD crash resilient and "cloud buzzword compatible" policy ;)
  • is it perfectly secure?
    • no; no system, offline or online, remote or local, is anyway
    • security by obscurity does not work, so posting information about this here facilitates a potential malicious person's task, but it also strengthens my good practices
    • consequently, remotely stored backups and information loops are provided by default so that when (not if) problems happen, I can recover, and fast

See also

Motivation for GNU/Linux transition on the client/interface side

To do