Poor man's search - Ctrl+F is your friend :)

{
    "meta": {
        "title": "/open_geodata/",
        "subtitle": "a blog",
        "description": "about something",
        "author": "Thomas Kandler",
        "url": "http://opengeodata.de"
    },
    "pages": [
        {
            "title": "About",
            "date": "23.05.2017",
            "updated": "23.05.2017",
            "comments": true,
            "path": "about/index.html",
            "permalink": "http://opengeodata.de/about/index.html",
            "excerpt": "",
            "text": "The contents of this site are published by Thomas Kandler. Get in touch via email, if you like: h-all-o@thoma-s-k-andler.net (remove all hyphens before actually sending). If you need physical contact data, head over to denic. \ud83c\udf75"
        }
    ],
    "posts": [
        {
            "title": "Zonal Benchmark Tool",
            "slug": "zonalbenchmark",
            "date": "25.07.2017",
            "permalink": "http://opengeodata.de/2017/07/25/zonalbenchmark/",
            "text": "As a follow up to my idea at the Spatial Ecology summer school in Matera I\u2019d like to point you to my github repo: zonalbenchmark. The gist of it: \u201cWhat tool can I use to calculate the zonal statistics for my data set?\u201d I hope to ignite some thoughts about different tools for this task. If this little python script is furthermore useful for someone to justify a decision, I am more than happy. Sample usage:123python zonalStatBenchmark [tools] [input raster] [input mask / shape] [number of runs]python zonalStatBenchmark.py 1-2-3-4 test_data/wc2.0_10m_tavg_07.tif test_data/mask.shp 1 There\u2019s still a lot to to do; e.g.: add more tools for benchmarking improve error handling add support for \u2018per-feature-benchmarking\u2019 (load one feature from shape, run benchmark, load the next one, run benchmark, etc.) add multilanguage capabilities add test data to repo write unit tests If you\u2019re interested, please do not hesitate to contact me or to open an issue on github.",
            "tags": [
                {
                    "name": "python",
                    "slug": "python",
                    "permalink": "http://opengeodata.de/tags/python/"
                },
                {
                    "name": "summer school",
                    "slug": "summer-school",
                    "permalink": "http://opengeodata.de/tags/summer-school/"
                },
                {
                    "name": "matera",
                    "slug": "matera",
                    "permalink": "http://opengeodata.de/tags/matera/"
                },
                {
                    "name": "geospatial computing",
                    "slug": "geospatial-computing",
                    "permalink": "http://opengeodata.de/tags/geospatial-computing/"
                },
                {
                    "name": "grass",
                    "slug": "grass",
                    "permalink": "http://opengeodata.de/tags/grass/"
                },
                {
                    "name": "openforis",
                    "slug": "openforis",
                    "permalink": "http://opengeodata.de/tags/openforis/"
                },
                {
                    "name": "saga",
                    "slug": "saga",
                    "permalink": "http://opengeodata.de/tags/saga/"
                },
                {
                    "name": "pktools",
                    "slug": "pktools",
                    "permalink": "http://opengeodata.de/tags/pktools/"
                }
            ]
        },
        {
            "title": "Best of Bash 5",
            "slug": "best-of-bash-5",
            "date": "29.06.2017",
            "permalink": "http://opengeodata.de/2017/06/29/best-of-bash-5/",
            "text": "A thing the great Julia Evans was using in a recent blog post: tr.1234567tr - translate or delete characterscat /proc/3091/environ | tr '\\0' '\\n'-- get the environment variables from process 3091-- the environment variables contain hidden null bytes (\\0)-- which will be replaced with a new line (\\n) by tr Another quick and easy yet very helpful tool - mogrify. It is actually a common \u201chousehold remedy\u201d for a great deal of tasks in linux image processing, yet I wasn\u2019t aware of this simple usage example, which will convert any .bmp in a directory to .jpg. 1mogrify -format jpg *.bmp I wrote about the IP lookup in bash some time ago but this service is imho the simplest one: 12345curl ipinfo.iocurl ipinfo.io/ip -- display only IPcurl ipinfo.io/country -- display only country A nice snippet for finding the most recently changed files in a directory (and its subdirectories): find $1 -type f -exec stat --format '%Y :%y %n' "{}" \\; | sort -nr | cut -d: -f2- | head",
            "tags": [
                {
                    "name": "linux",
                    "slug": "linux",
                    "permalink": "http://opengeodata.de/tags/linux/"
                }
            ]
        },
        {
            "title": "Matera - Day 5",
            "slug": "Matera-Day-5",
            "date": "23.06.2017",
            "permalink": "http://opengeodata.de/2017/06/23/Matera-Day-5/",
            "text": "This day will be dedicated to projects of the participants. Conference call - Paul Harris Modelling heterogenity using geographically weighted modelling GWmodel R package, rpubs.com/gwmodel, geographically weighted modelling website Misc GRASS has a list of experimental addons IEEE Open Source for geospatial link list",
            "tags": [
                {
                    "name": "summer school",
                    "slug": "summer-school",
                    "permalink": "http://opengeodata.de/tags/summer-school/"
                },
                {
                    "name": "matera",
                    "slug": "matera",
                    "permalink": "http://opengeodata.de/tags/matera/"
                },
                {
                    "name": "geospatial computing",
                    "slug": "geospatial-computing",
                    "permalink": "http://opengeodata.de/tags/geospatial-computing/"
                },
                {
                    "name": "modelling",
                    "slug": "modelling",
                    "permalink": "http://opengeodata.de/tags/modelling/"
                },
                {
                    "name": "projects",
                    "slug": "projects",
                    "permalink": "http://opengeodata.de/tags/projects/"
                }
            ]
        },
        {
            "title": "Matera - Day 4",
            "slug": "Matera-Day-4",
            "date": "22.06.2017",
            "permalink": "http://opengeodata.de/2017/06/22/Matera-Day-4/",
            "text": "The course gets split up in basic R and advanced R users for the morning. Session 1 - R basics R has some bash capabilities and replicates all data which is being processed internally R stores variables in their data represention which is kept in the RAM; rather than just keeping the \u201cformula\u201d (like Python does) R is not really connected to bash; \u201center R, close the door to bash\u201d sudo needed for installing packages 1234567891011121314151617rm(a) - remove variablegc() - clean RAM (garbage collector)?<command> - get help on command (with examples)q() - quite Rsystem(\"pwd\") - run system commanddata.frame=read.table(\"filename\") - read something from outside into R into data frame str(data.frame) - show structure$ - indicates another level, e.g. landuse04$landusehead(data.frame) - show head of data frameobject.size(dem) - show byte size of data framedem$X=as.character(dem$X) - change data type of 'X' in 'dem' data frame to charactersave(landuse04, file=\"~/landuse2004.Rdata\") - save data frame as filesave.image() - save whole workspaceload(\"~landuse2004.Rdata\")rm(list = ls()) - remove everything in workspaceplot(landuse$fallow.Fallow, landuse$vineyard.Vineyards) - crude plottinglanduse[1:3 , 3:10] - access data via indices; first value pair = rows; second value pair = columns get information & inspiration for graphing/plotting: R graph gallery 12345678install.packages(\"raster\")library(raster)myinput=raster(\"/home/user/ost4sem/exercise/basic_adv_gdalogr/input.tif\")plot(myinput) -- install raster package, load it, load file and plot the raster -- raster is being kept in file instead of memory @ - sub-level indicator for raster images Session 2 - conference call Victoria O\u2019Brien - an introduction to sentinel satellite data Sentinel 1 - radar data, continously collecting data, 6 day cyle Sentinel 2 - visible data, multispectral, resolution: 10, 20, 60 meters, 5 day cycle data is kept by multiple organizations access 
via browser; e.g. copernicus open access hub (registration needed) access via tiles; use KML to get the organization of the tiles; get them on AWS also a CLI tool available: sentinelsat ESA SNAP - software for exploring, processing and classifying remote sensing data copernicus emergency management system; e.g. rapid flood mapping, forest fires, etc. Session 3 - R basics / distribution modelling BioClim as a way to incorporate multiple important indicators for analysis rbind(presence,absence) - join two tables; table(points$PA) - count occurrences of attribute; na - handle missing values (omit, fail, etc.); c - combine values into list or vector sp-package is quite useful for spatial operations general idea for distribution modeling (sort of): get predictor, standardize, make tables, construct model with predictors Session 4 - GRASS basics bash is available in GRASS text mode grass70 -text ~/ost4sem/grassdb/europe/PERMANENT - start GRASS in textmode and load location 'europe' in 'grassdb'; r.info --ui - runs the r.info function (info about a layer) and starts the GUI dialogue for it more interesting commands: g.copy rast=potveg_ita@Vmodel,pvegita - copy within GRASS; g.remove -f type=raster name=pvegita - remove raster dataset; g.region -p - get current region; g.region n=6015390 e=5676400 s=3303955 w=3876180 res 1000 save=scandinavia --overwrite - set new region; g.region res=20000 -p - change resolution; g.gui tcltk - bring up GUI (possible arguments for GUI wxpython,text,gtext on this particular machine); g.list type=rast -p - list all raster maps (-p for pretty printing); # We can open a monitor and display a raster: g.region rast=fnfpc; d.mon start=x0; d.rast fnfpc; # and do the same thing for the other maps in different monitors: d.mon start=x1; d.rast fnfpc_alpine10k; # get input into GRASS: r.in.gdal input=~/ost4sem/exercise/basic_adv_grass/inputs/lc_cor2000/hdr.adf output=landcover Session 5 - remote sensing & machine learning Tom Jones, EO specialist, 
Catapult Satellite Application current time is disruptive in terms of satellite imaging; many factors are changing quite dramatically, so fast moving technology companies have a real advantage since 2013 rapid increase in downloads of Landsat scenes remotely monitored road conditions are one example for future applications OS software in the field: OpenCV, SciLab, RSGISLib, SNAP, GDAL, scikit-learn Proprietary software: IDL, eCognition, ArcGIS, FME random bits: Jupyter notebooks, sklearn python module, ARCSI CLI pre-processing for remote sensing images, Shepherd segmentation algorithm for clumping, tuiview",
            "tags": [
                {
                    "name": "python",
                    "slug": "python",
                    "permalink": "http://opengeodata.de/tags/python/"
                },
                {
                    "name": "summer school",
                    "slug": "summer-school",
                    "permalink": "http://opengeodata.de/tags/summer-school/"
                },
                {
                    "name": "matera",
                    "slug": "matera",
                    "permalink": "http://opengeodata.de/tags/matera/"
                },
                {
                    "name": "geospatial computing",
                    "slug": "geospatial-computing",
                    "permalink": "http://opengeodata.de/tags/geospatial-computing/"
                },
                {
                    "name": "r",
                    "slug": "r",
                    "permalink": "http://opengeodata.de/tags/r/"
                },
                {
                    "name": "remote sensing",
                    "slug": "remote-sensing",
                    "permalink": "http://opengeodata.de/tags/remote-sensing/"
                },
                {
                    "name": "grass",
                    "slug": "grass",
                    "permalink": "http://opengeodata.de/tags/grass/"
                }
            ]
        },
        {
            "title": "Matera - Day 3",
            "slug": "Matera-Day-3",
            "date": "21.06.2017",
            "permalink": "http://opengeodata.de/2017/06/21/Matera-Day-3/",
            "text": "A basic introduction to Python, its core concepts as well as problem solving strategies based on certain geospatial packages and internet research. instructor for the day: fpl Session 1 - Python basics ~90% of packages are ported to Python3 to this date; check if your a package you need is already ported: Py3Readiness 1chmod a+x - executable for all groups did not know about tuple assigment like below: 1234567#!/usr/bin/python#~ # This is the well-known Fibonacci seriesa, b = 0, 1while b < 2000: print a a, b = b, a + b keyword arguments can be useful if you want pass some, but not all, arguments to a function which takes multiple parameters 12345678910111213''' Keyword arguments in calling functions'''def fibonacci(n=2000): a, b = 0, 1 f = [] while b<n: f.append(a) a, b = b, a+b return f s = fibonacci(n=10000)print s Session 2 - Python OGR adding a custom path for modules: 1sys.path.append('/home/user/my_module') useful Python OGR functions 123GetFeatureCount() - returns feature countGetSpatialRef().ExportToProj4() - returns a proj4 stringGetPointCount() - returns point count a simple script for checking out some OGR Python functions attention (for me) to be paid to geometry.GetGeometryName() == 'POINT' for checking the Geometry type also an interesting (& crude) way to get CLI arguments: args = [] /n args.append(sys.argv) open a shape file shp = ogr.Open(shpFile) get geometry of feature - geometry = feature.GetGeometryRef() get field name - feature.GetField(fieldName) 123456789101112131415161718192021222324252627282930313233343536373839404142434445464748495051525354#Examine a shapefile with ogrfrom osgeo import ogrimport osimport sysargs = []args.append(sys.argv)# set working diros.chdir('../files')# check if file name was giventry: shpFile = args[0][1]except: print 'No input file specified.' sys.exit(1)# check if field was giventry: fieldName = args[0][2]except: print 'No field specified.' 
sys.exit(1); # open the shapefile: shp = ogr.Open(shpFile); # Get the layer: try: layer = shp.GetLayer(); except: print 'File not found.'; sys.exit(1); # Loop through the features and print information about them: for feature in layer: geometry = feature.GetGeometryRef(); # check if the field name exists: try: feature.GetField(fieldName); except: print 'Wrong field name given.'; sys.exit(1); if geometry.GetGeometryName() == 'POINT': # print the info: print geometry.GetX(), geometry.GetY(), feature.GetField(fieldName); else: print 'Only works for point geometries.'; sys.exit(1) Session 3 - Python OGR get all functions for an object in OGR, open ipython: from osgeo import ogr; shp = ogr.Open('point.shp'); shp. + Press TAB > shp.CommitTransaction shp.GetDriver shp.GetMetadata_List shp.SetDescription shp.CopyLayer shp.GetLayer shp.GetName shp.SetMetadata shp.CreateLayer shp.GetLayerByIndex shp.GetRefCount shp.SetMetadataItem shp.DeleteLayer shp.GetLayerByName shp.GetStyleTable shp.SetStyleTable shp.Dereference shp.GetLayerCount shp.GetSummaryRefCount shp.StartTransaction shp.Destroy shp.GetMetadata shp.Reference shp.SyncToDisk shp.ExecuteSQL shp.GetMetadataDomainList shp.Release shp.TestCapability shp.FlushCache shp.GetMetadataItem shp.ReleaseResultSet shp.name shp.GetDescription shp.GetMetadata_Dict shp.RollbackTransaction shp.this; layer = shp.GetLayer(); layer. 
+ Press TAB > layer.AlterFieldDefn layer.GetFeature layer.GetSpatialRef layer.SetNextByIndex layer.Clip layer.GetFeatureCount layer.GetStyleTable layer.SetSpatialFilter layer.CommitTransaction layer.GetFeaturesRead layer.Identity layer.SetSpatialFilterRect layer.CreateFeature layer.GetGeomType layer.Intersection layer.SetStyleTable layer.CreateField layer.GetGeometryColumn layer.Reference layer.StartTransaction layer.CreateFields layer.GetLayerDefn layer.ReorderField layer.SymDifference layer.CreateGeomField layer.GetMetadata layer.ReorderFields layer.SyncToDisk layer.DeleteFeature layer.GetMetadataDomainList layer.ResetReading layer.TestCapability layer.DeleteField layer.GetMetadataItem layer.RollbackTransaction layer.Union layer.Dereference layer.GetMetadata_Dict layer.SetAttributeFilter layer.Update layer.Erase layer.GetMetadata_List layer.SetDescription layer.next layer.FindFieldIndex layer.GetName layer.SetFeature layer.schema layer.GetDescription layer.GetNextFeature layer.SetIgnoredFields layer.this layer.GetExtent layer.GetRefCount layer.SetMetadata layer.GetFIDColumn layer.GetSpatialFilter layer.SetMetadataItem",
            "tags": [
                {
                    "name": "python",
                    "slug": "python",
                    "permalink": "http://opengeodata.de/tags/python/"
                },
                {
                    "name": "summer school",
                    "slug": "summer-school",
                    "permalink": "http://opengeodata.de/tags/summer-school/"
                },
                {
                    "name": "matera",
                    "slug": "matera",
                    "permalink": "http://opengeodata.de/tags/matera/"
                },
                {
                    "name": "geospatial computing",
                    "slug": "geospatial-computing",
                    "permalink": "http://opengeodata.de/tags/geospatial-computing/"
                }
            ]
        },
        {
            "title": "Matera - Day 2",
            "slug": "Matera-Day-2",
            "date": "20.06.2017",
            "permalink": "http://opengeodata.de/2017/06/20/Matera-Day-2/",
            "text": "After a surprisingly swift intro to Linux and bash the day will evolve around the GDAL library as well as the - relatively obscure - pktools. Session 1 - gdalEPSG / Spatial Referece Information gdalinfo does not include the nodata value in the statistics, mind the data-type gdal_translate can be used for tiling (as an example) -projwin ulx uly lrx lry when cropping with gdal_translate (or any software) the cropping window does eventually not match perfectly, so the software will resample (or shift) automatically in the background \u201cThe VRT driver is a format driver for GDAL that allows a virtual GDAL dataset to be composed from other GDAL datasets with repositioning, and algorithms potentially applied as well as various kinds of metadata altered or added.\u201d VRT can be used to link together many files or create subsets (extents) of images 1gdal_translate --formats | grep ENVI - find if gdal supports the format you want ogrinfo can have very useful ouput an also supports SQL statements 12345678ogrinfo -al shape.shp>> OGRFeature(poly_fr_10poly):0 id (Integer64) = 2 region (Integer) = 2 POLYGON ((3872295.18072289 2681195.78313253,3915993.97590361 2666629.51807229,3901427.71084337 2615647.59036145,3872295.18072289 2681195.78313253)) Session 2 - gdal, bash scripting OpenEV - part of FWTools 12openev TCmean01-10_1km.tif - quickly view a image or vector (uses GDAL)STRL + \\ - kill application more AWK stuff for file in *.tif ; do gdalinfo -mm $file | grep \u201cMin=\u201d | awk -F \u201c,\u201d \u2018{ print $2 }\u2019 done set comma separation, print second column get resolution of each TIF file and display only those with 240 pixel 1234567891011 for file in *.tif ; do echo $file $(gdalinfo -mm $file | grep \"Size is \")done | grep \"240\"- with added basename function:for file in *.tif ; do echo $(basename $file ) $(gdalinfo -mm $file | grep \"Size is \")done | grep \"240\" Bash string manipulation 1grep -v \"inverted\" - do an inverted grep 
Session 3 - pktools. pktools are based on gdal but go further in many ways; e.g. extract the bounding box coords without grepping or awking. Written in C++, good documentation, relatively narrow focus. Session 4 - openforis oft-tools quite usable tools from Open Foris learning strategy: 1) GDAL 2) PKtools 3) Open Foris tools oft-stat can be used for zonal statistics (very fast): oft-stat -i INPUT.tif -o output.txt -um INPUT_MASK.tif >> INPUT_MASK = rasterized vector file oft-clump and oft-bb can be used for preprocessing by finding the correct bounding boxes to do the zonal statistics with todo: do a benchmark between QGIS, pktools and GRASS (maybe SAGA, perrygeo gist, Python package, starspan, R, R - multicore, PostGIS, GeoTools) zonal statistics and oft-stat (be aware of the zonal freaks)",
            "tags": [
                {
                    "name": "summer school",
                    "slug": "summer-school",
                    "permalink": "http://opengeodata.de/tags/summer-school/"
                },
                {
                    "name": "matera",
                    "slug": "matera",
                    "permalink": "http://opengeodata.de/tags/matera/"
                },
                {
                    "name": "geospatial computing",
                    "slug": "geospatial-computing",
                    "permalink": "http://opengeodata.de/tags/geospatial-computing/"
                },
                {
                    "name": "bash scripting",
                    "slug": "bash-scripting",
                    "permalink": "http://opengeodata.de/tags/bash-scripting/"
                },
                {
                    "name": "gdal",
                    "slug": "gdal",
                    "permalink": "http://opengeodata.de/tags/gdal/"
                },
                {
                    "name": "oft",
                    "slug": "oft",
                    "permalink": "http://opengeodata.de/tags/oft/"
                },
                {
                    "name": "openforis",
                    "slug": "openforis",
                    "permalink": "http://opengeodata.de/tags/openforis/"
                }
            ]
        },
        {
            "title": "Matera - Day 1",
            "slug": "Matera-Day-1",
            "date": "19.06.2017",
            "permalink": "http://opengeodata.de/2017/06/19/Matera-Day-1/",
            "text": "Starting of the summer school in Matera; after the introduction we get to know Linux and bash or are trying to learn more. This was refreshing or new for me \u2026 Session 1 - bash basics123456789101112pwd -> current dir man -k count -> search for command involving the keyword (-ks) cd ../.. -> go up two directories& (at end of command) -> run program in background, keep terminal usablefg -> will resume the most recently suspended or backgrounded jobps -aux | grep evince - get PID for evinceCTRL + L - scroll to current command hiding everythingCTRL + A - go to beginning of a commandll - same as ls -lmore - open a text file partially!! - repeats the last commanddu -hs * | sort -hr - list all directories sorting by size PCManFM supports tab completion in the path. Command Line Bootcamp Session 2 - bash basicsString manipulation1234* - a string with o or more character -> ls /dev/tty*? - a single character -> ls /dev/tty?[ ] - one of a single character listed -> ls /dev/tty[2-4]{} - one of a single string listed -> ls /dev/tty{zd,zc} Misc12345find /home/user -size +5M -name "*.pdf" | xargs du -sh find PDF files which are bigger than 5MB and display file sizeseq 1 100 - generate sequence from 1 to 100 grep "2002 06" input.txt - grep two columns in input.txt (searching for June 2002) For-loop1234var=$(grep \"2007\" input.txt | wc -l) - set a command result to a variablefor ((var=2005 ; var<=2007 ; var++)); do grep $var input.txt | wc -l || echo $var; done - simple for-loopfor var in $(seq 2005 2007); do grep $var input.txt | wc -l; done - same simple for-loopfor var in $(seq 2005 2007); do grep $var input.txt | echo $(wc -l) $(echo $var); done - same simple for-loop with printing the $var also Session 3 - bash basics & AWKAWK processes file in cascade mode - line for line. It is most useful for data reduction. Can be used for pre-processing (calculations before importing into other programs) as it is sometimes more efficient. 
awk '{ print $5 , $2 }' input.txt # print columns 5 and 2 (space separated) awk '{ print $5 \",\" $2 }' input.txt # print columns 5 and 2 (comma separated) awk '{ print NF }' input.txt # print number of columns (count) awk '{ print NR }' input.txt # print number of rows (count) awk '{ print substr($1,1,4) }' input.txt # string manipulation awk -v # import variable in awk query Associative arrays are a powerful concept. awk '{ Year[$2]++; } END { for (var in Year) print var, Year[var], \" data points\"}' input.txt Further reading on sed to be done (in case of string-only operations).",
            "tags": [
                {
                    "name": "bash",
                    "slug": "bash",
                    "permalink": "http://opengeodata.de/tags/bash/"
                },
                {
                    "name": "summer school",
                    "slug": "summer-school",
                    "permalink": "http://opengeodata.de/tags/summer-school/"
                },
                {
                    "name": "matera",
                    "slug": "matera",
                    "permalink": "http://opengeodata.de/tags/matera/"
                },
                {
                    "name": "geospatial computing",
                    "slug": "geospatial-computing",
                    "permalink": "http://opengeodata.de/tags/geospatial-computing/"
                },
                {
                    "name": "awk",
                    "slug": "awk",
                    "permalink": "http://opengeodata.de/tags/awk/"
                }
            ]
        },
        {
            "title": "Starting with Kivy",
            "slug": "starting-with-kivy",
            "date": "29.05.2017",
            "permalink": "http://opengeodata.de/2017/05/29/starting-with-kivy/",
            "text": "How to set up kivy. 12sudo add-apt-repository ppa:kivy-team/kivy sudo apt-get update && sudo apt-get install python3-kivy python-kivy-examples Get started with the first app and relase it via buildozer (Android will be the target). 1234567891011121314git clone https://github.com/kivy/buildozer.gitcd buildozersudo python2.7 setup.py installcd PROJECT_DIR && buildozer init# dependencies for android release on Ubuntu 16.04sudo pip install --upgrade cython==0.21sudo dpkg --add-architecture i386sudo apt-get updatesudo apt-get install build-essential ccache git libncurses5:i386 libstdc++6:i386 libgtk2.0-0:i386 libpangox-1.0-0:i386 libpangoxft-1.0-0:i386 libidn11:i386 python2.7 python2.7-dev openjdk-8-jdk unzip zlib1g-dev zlib1g:i386# deploybuildozer android debug deploy run",
            "tags": [
                {
                    "name": "kivy",
                    "slug": "kivy",
                    "permalink": "http://opengeodata.de/tags/kivy/"
                },
                {
                    "name": "python",
                    "slug": "python",
                    "permalink": "http://opengeodata.de/tags/python/"
                },
                {
                    "name": "app",
                    "slug": "app",
                    "permalink": "http://opengeodata.de/tags/app/"
                },
                {
                    "name": "dev",
                    "slug": "dev",
                    "permalink": "http://opengeodata.de/tags/dev/"
                }
            ]
        },
        {
            "title": "Best of Bash 4",
            "slug": "best-of-bash-4",
            "date": "23.05.2017",
            "permalink": "http://opengeodata.de/2017/05/23/best-of-bash-4/",
            "text": "Recently I was looking for a digital punch card. Of course, I could just use the w command, subtract my breaks and that\u2019s that at the end of the day. But there\u2019s always something not work related which needs to be remembered and after all: I am lazy.So, I installed sp.app.myWorkClock on my phone, which has a nice widget to literally punch (touch) in and out. The output of my Work Clock is a SQLite3 database. To use it in bash I need to do sudo apt-get install sqlite3. The export_punches.sh looks like this (via SO): 1234567#!/bin/bashsqlite3 ~/PunchClock_*.db <<!.headers on.mode csv.output out.csvselect punchId, strftime('%Y-%m',startTime) as context, strftime('%d',startTime) as day, round(strftime('%H',startTime) + (strftime('%M',startTime)/60.0),1) as start, round(strftime('%H',endTime) + (strftime('%M',endTime)/60.0),1) as end from WorkTimes;! This redirects basically everything inbetween the exclamation marks to the sqlite3 program. It turns headers on, sets CSV mode, defines the output file and states a SQL command which will output the data the way I need it (company time cards count and add hours in decimal mode). And that\u2019s that.",
            "tags": [
                {
                    "name": "linux",
                    "slug": "linux",
                    "permalink": "http://opengeodata.de/tags/linux/"
                },
                {
                    "name": "android",
                    "slug": "android",
                    "permalink": "http://opengeodata.de/tags/android/"
                },
                {
                    "name": "time-tracking",
                    "slug": "time-tracking",
                    "permalink": "http://opengeodata.de/tags/time-tracking/"
                }
            ]
        },
        {
            "title": "Best of Bash 3 (Hexo edition)",
            "slug": "best-of-bash-3",
            "date": "23.05.2017",
            "permalink": "http://opengeodata.de/2017/05/23/best-of-bash-3/",
            "text": "How to setup and use hexo (the engine running this) coming from Wordpress using rsync as deployment method: 12345678npm install -g hexo-clihexo init cool-blogcd cool-blognpm install hexo-deployer-rsync --savenpm install hexo-migrator-wordpress --savehexo migrate wordpress ~/cool-old-blog.xml // generates new files in ./sourcehexo new \"New cool post\"hexo generate --deploy Additional: 12npm install hexo-autolinker --savenpm install hexo-generator-seo-friendly-sitemap --save Plus, one can tinker around in the _config.yml as pleases (this is also where the rsync deployer is configured).",
            "tags": [
                {
                    "name": "linux",
                    "slug": "linux",
                    "permalink": "http://opengeodata.de/tags/linux/"
                },
                {
                    "name": "bash",
                    "slug": "bash",
                    "permalink": "http://opengeodata.de/tags/bash/"
                },
                {
                    "name": "hexo",
                    "slug": "hexo",
                    "permalink": "http://opengeodata.de/tags/hexo/"
                }
            ]
        },
        {
            "title": "Best Of Bash 2",
            "slug": "best-of-bash-2",
            "date": "02.12.2016",
            "permalink": "http://opengeodata.de/2016/12/02/best-of-bash-2/",
            "text": "The second installment of nice bash-y things. A bit lewd, yet simple and useful - check your IP, location and ISP in bash. 1curl https://wtfismyip.com/json 2>&1 | grep -E 'YourFuckingLocation|YourFuckingIPAddress|YourFuckingISP' or1wget -O - https://wtfismyip.com/json 2>&1 | grep -E 'YourFuckingLocation|YourFuckingIPAddress|YourFuckingISP'` Javier L\u00f3pez wrote a good tool for people which have the need to connect to a DLNA server without much fuzz. So, to search for something on the DLNA server which has \u2018Gravity\u2019 in its name, just type: 1./simple-dlna-browser.sh -v Gravity Need to contain software like Firefox or Skype? Try firejail. Thanks to pre-made configs, using it can be as simple as: 1firejail skypeforlinux",
            "tags": [
                {
                    "name": "linux",
                    "slug": "linux",
                    "permalink": "http://opengeodata.de/tags/linux/"
                },
                {
                    "name": "bash",
                    "slug": "bash",
                    "permalink": "http://opengeodata.de/tags/bash/"
                },
                {
                    "name": "IP",
                    "slug": "IP",
                    "permalink": "http://opengeodata.de/tags/IP/"
                },
                {
                    "name": "networking",
                    "slug": "networking",
                    "permalink": "http://opengeodata.de/tags/networking/"
                },
                {
                    "name": "dlna",
                    "slug": "dlna",
                    "permalink": "http://opengeodata.de/tags/dlna/"
                },
                {
                    "name": "media",
                    "slug": "media",
                    "permalink": "http://opengeodata.de/tags/media/"
                },
                {
                    "name": "sandbox",
                    "slug": "sandbox",
                    "permalink": "http://opengeodata.de/tags/sandbox/"
                }
            ]
        },
        {
            "title": "Best of bash 1",
            "slug": "best-of-bash-1",
            "date": "30.10.2016",
            "permalink": "http://opengeodata.de/2016/10/30/best-of-bash-1/",
            "text": "As a mean to reflect and preserve certain useful commands, I\u2019ll start this little series. Here we go: 1mogrify -resize 50x50% -quality 90 -format jpg *.JPG Take all JPG files in one folder and reduce their size by 50%. 1234sudo add-apt-repository ppa:fossfreedom/byzanzsudo apt-get update && sudo apt-get install byzanzbyzanz-record -c -d 120 --delay=3 record.gifffmpeg -i record.gif -movflags faststart -pix_fmt yuv420p -vf \"scale=trunc(iw/2)_2:trunc(ih/2)_2\" video.mp4 Get the byzanz-record tool which will create a relatively small GIF of you using your screen to quickly solve a problem and show this to someone else. ffmpeg will convert it to a video in case it is neccessary. 1for i in {01..12}; do gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -dFirstPage=$i -dLastPage=$i -sOUTPUTFILE=output_$i.pdf certificates.pdf; done Convert a 12-paged PDF file to 12 single-paged files. 1nmap -p 22 --open -sV 192.168.1.0/24 > sshservers.txt Scan a local network for ssh-enabled devices; useful for finding your Raspberry Pi in a semi-public WLAN.",
            "tags": [

            ]
        },
        {
            "title": "Don't call it a hackathon",
            "slug": "dont-call-it-a-hackathon",
            "date": "17.03.2016",
            "permalink": "http://opengeodata.de/2016/03/17/dont-call-it-a-hackathon/",
            "text": "What a great tweet came across my way: "Hackathon" and competition do not attract women to tech programs. Great reflection and pivot from NCSU #c4l16 pic.twitter.com/YblhDNKCa0— Erin White (@erinrwhite) 10. M\u00e4rz 2016 And because the author of this tweet remarked, it is interesting to see this retweeted hundreds of times while the talk itself is on quite different topic, I will post the transcript of said talk. (Also because the talk seems awesome) Hi my name is Allison, from NC Libraries, here to talk about a project called code art, this is a project I took over managing last July and in second year.It started as contest for students, for display on large scale video walls in the library, which Heidi mentioned earlier.This is art created with autonomous system and can run, for example, on computer algorithms.So the library opened in 2013, four video walls built into the public spaces of library were intended to be canvass for the library to show student and faculty work.So the code art contest was created to advertise this, it was sponsored by digital systems, maker of [Name?], the competition with a substantial monetary of prize, hundreds of dollars for first and second place winners would be attracted to students, along with getting the exhibit to work in the library.Another aim of the contest was to get awareness, coding, and encourage students to learn to code who wouldn\u2019t of considered it a possibility.Making art with code, this includes processing.Last year in 2015 the contest structure required interest students write a written proposal, they competed for the final judging.The projects were developed over a few months.The outcome of last year\u2019s contest was that we had two very impressive pieces produced for video walls, created with data, code and stand-alone art.The winner was Forest, entry microcontroller to make trees that grow in a planet, so sun and moon revolve and serve as hands of a clock.The WKP visualizer 
visualizes, birds flying over the sky line of Raleigh.It was visualized in the building with the light flowing up and down.Taking over the project in July I set some improvement goals for 2016 including more student participation, more diverse participation in terms of students participating in terms of their identities and also program of study.More faculty involvement and mentorship of participants who might be interested in entering the contest.One challenge, and potential opportunity there, the pool of students who already make code on campus is pretty small and hard to identify.Very few courses on campus related to making art with code.The coding there in computer science program about a thousand undergraduate and graduate students, not clear how many are interested in art.How many are interested in coding, the design and using digital design tools, however.Creative, eager to help with advertising and mentoring students.Also creating new classes, involve creating coding in some fashion.The deadline is next week for the contest.Planned a series of events in maker space that allows students with no experience to get hands-on experience and make something, these creations would be eligible to submission to the contest.Interestingly, while the workshop and hackathon, no experience necessary, they drew different audiences.Perhaps people waited to enter because of the title hackathon, maybe a certain kind of competition, workshop seems more accessible.Just last week, the studies from national University of Singapore, highly competitive settings woman made weighted, qualified woman may be discouraged from competing.Due to structural courses in society, competitions may not be the best way to identify talent.Most talented may not be competing.It may be that more non-competitive programming is a key to building this on campus, we can support this year the modest gains, we have modest gains in number of woman, students of color and non-coders who participated in the 
program in contest more specifically, we have more work to do this.This when includes shifting focus from being just a contest to more robust and inclusive program, more opportunity for underrepresented students.I believe we can develop community for everyone that wants to learn to make art with code will feel empowered to do so.Thank you. [APPLAUSE]",
            "tags": [

            ]
        },
        {
            "title": "Centerline / Skeleton of Polygon with PostGIS",
            "slug": "centerline-skeleton-of-polygon-with-postgis",
            "date": "10.09.2015",
            "permalink": "http://opengeodata.de/2015/09/10/centerline-skeleton-of-polygon-with-postgis/",
            "text": "Suppose you want to have the center line of a polygon. Further suppose you do not have access to proprietary means for this goal. PostGIS with SFCGAL comes to the rescue. SFCGAL will enable the ST_StraightSkeleton function in PostGIS and is currently available in PostGIS >2.1.x. User Zia posted a good How-To on SE. Once you are set with PostGIS and SFCGAL, you can go ahead using the following query: with xxx as ( select objectid, -- dump MultiLineString into seperate parts (st_dump(ST_StraightSkeleton(geometry))).path[1], (st_dump(ST_StraightSkeleton(geometry))).geom as geometry from table ) select * from xxx -- make sure the seperate parts which are within 1m of the exterior of the polygon do not get into the result where not st_dwithin(xxx.geometry, st_exteriorring((select geometry from table)), 1) -- get rid of some of the lose ends which do not touch any line and ((st_touches(st_startpoint(xxx.geometry), (select st_union(geometry) from xxx)) AND st_touches(st_endpoint(xxx.geometry), (select st_union(geometry) from xxx)))); The operation will be quite costly, so better run it in pgsql2shp or ogr2ogr in order to write to a file rather than your DB application. The latter one would work like this: ogr2ogr -f "ESRI Shapefile" shapefilename.shp PG:"host=localhost user=user dbname=db_name password=pass" -sql "the query" After this you\u2019ll need to clean up a bit. Or you can set a treshold for ST_Length and include it in the WHERE clause. It will not work perfectly but reasonably better than manual work in most occasions. Especially analysis on large polygons will benefit.",
            "tags": [

            ]
        },
        {
            "title": "TV3 News aus Radio & Presse -  Bulk Download with Bash",
            "slug": "tv3-news-aus-radio-presse-bulk-download-with-bash",
            "date": "09.09.2015",
            "permalink": "http://opengeodata.de/2015/09/09/tv3-news-aus-radio-presse-bulk-download-with-bash/",
            "text": "Maybe someone finds this useful by either speaking/learning German or having a similar task at hand. The TV3 news site is a daily updated site with radio features and news from the day before. Always interesting to listen to. Bulk download for a whole day can be done via the .m3u file which is also updated daily and has a consistent date string as filename. Therefore some wget and bash scripting will do. ` #!/bin/bashfoo=\u2019http://www.tvdrei.de/POD/POD/Archiv/2015/Playlist/\u2018bar=$(date +%Y%m%d -d \u201cyesterday\u201d)rar=\u2019.m3u\u2019 wget $foo$bar$rar -O $bar.txt wget -c -nc -i $bar.txt rm $bar.txt`",
            "tags": [

            ]
        },
        {
            "title": "todo.txt",
            "slug": "todo-txt",
            "date": "22.01.2015",
            "permalink": "http://opengeodata.de/2015/01/22/todo-txt/",
            "text": "If you know some German, see this article at t3n, otherwise check out the Git for todo.txt. If you want to make use of the todo.txt command-line tool real quick, add this to your .bashrc (Linux) or .bash_profile (Windows/Mac):PATH=$PATH:"/path/to/todo.sh/folder/" export TODOTXT_DEFAULT_ACTION=ls alias t='todo.sh -d /path/to/your/todo.cfg' You can now use t to add, delete, update tasks.",
            "tags": [

            ]
        },
        {
            "title": "QGIS FieldPyculator - Area",
            "slug": "qgis-fieldpyculator-area",
            "date": "20.01.2015",
            "permalink": "http://opengeodata.de/2015/01/20/qgis-fieldpyculator-area/",
            "text": "As QGIS 2.6 has a strange bug when it comes to the field calculator and certain PostGIS-Layers, I got used to a plugin named FieldPyculator. The plugin has slightly different, python-esque syntax which leads me to noting how to calculate an integer area for a geometry object. value = int($geom.area())",
            "tags": [

            ]
        },
        {
            "title": "Browse bluetooth connected phone with linux (Crunchbang Debian)",
            "slug": "browse-bluetooth-connected-phone-with-linux-crunchbang-debian",
            "date": "20.08.2014",
            "permalink": "http://opengeodata.de/2014/08/20/browse-bluetooth-connected-phone-with-linux-crunchbang-debian/",
            "text": "As I was looking into getting some pictures from my phone to the local machine, I stumbled upon a quite annoying bug in Ubuntu 14.04 which seems to prevent a stable connection to share data between devices. I use Ubuntu (besides #!) mostly for some multimedia or plug-n-play stuff. So this is quite annoying. Luckily the #! community is crafty and came up with a solution for those doubting their sanity using tools like bluez or blueman which don\u2019t seem to work extraordinarily reliable. sudo apt-get install gvfs-bin sudo apt-get install gvfs-fuse sudo hcitool scan .. Scanning ... .. xx:xx:xx:xx:xx:xx YourPhone gvfs-mount obex://[xx:xx:xx:xx:xx:xx] This will mount your device to ~/.gvfs and you can just do all the stuff you normally do on a filesystem; e.g. copy all pictures: cp -avrn ~/.gvfs/YourPhone/scdcard0/DCIM/100DSC/ /home/user/images/ Unmount with gvfs-mount -u obex://[xx:xx:xx:xx:xx:xx] or just turn off bluetooth on your phone.",
            "tags": [

            ]
        },
        {
            "title": "The pragmatic programmer",
            "slug": "the-pragmatic-programmer",
            "date": "05.08.2014",
            "permalink": "http://opengeodata.de/2014/08/05/the-pragmatic-programmer/",
            "text": "I am rarely happier than when spending an entire day programming my computer to perform automatically a task that it would otherwise take me a good ten seconds to do by hand. \u2014Douglas Adams, \u201cLast Chance To See\u201d (via)",
            "tags": [

            ]
        },
        {
            "title": "Poor man's Pomodoro in Linux Terminal",
            "slug": "poor-mans-pomodoro-in-linux-terminal",
            "date": "23.07.2014",
            "permalink": "http://opengeodata.de/2014/07/23/poor-mans-pomodoro-in-linux-terminal/",
            "text": "Well, actually I find this approach rather minimalistic than poor but it certainly lacks some comfort. If you like the Pomodoro Technique check out this little line of code: sleep 1500 &amp;&amp; notify-send -u critical -i "/usr/share/pixmaps/waldorf.png" 'Pomodoro' '5min Pause' This will sleep in the terminal for 25 minutes to wake up and show you a notification which hides only onclick (hence the -u critical). -i will embed an icon you may find suitable, first string is a heading, second string is the actual message. If notify-send doesn\u2019t work, make sure you have libnotify installed. (via su and magnatecha)",
            "tags": [

            ]
        },
        {
            "title": "Random notes (1) - Linux SysAdmin",
            "slug": "random-notes-1-linux-sysadmin",
            "date": "04.07.2014",
            "permalink": "http://opengeodata.de/2014/07/04/random-notes-1-linux-sysadmin/",
            "text": "These notes were written with some prior knowledge of Linux and therefore may just represent some horrendous knowledge gaps of mine. Thanks to Dave from the tutoriaLinux yt-channel; check out his videos. See Github for nicer formatting. Basics terminalpwd - print working directory rmdir - remove dir (empty) man program - manual ls -s - symbolic link head - first 10 lines of file (default) tail - last 10 lines of file (default) tail -f /var/log/dmesg - follow the ende of the file (useful for logs) poweroff \u2013 init 0 / init 6 - shutdown / restart cp - copy cd ../../.. - go up 3 directories ls -lh - long list human readable sudo -i - interactive root session wc -l - count stuff df -h - list mounted devices (human readable) cut -d: -f2 - take some (piped) input, look for delimiter \u201c:\u201d, take stuff from second field; so Key1: Value1 will return Value1sort -bf - sort by first letter uniq - print only unique wc - word count grep - searching, finding, filtering (powerful, learn more)which - shows the full path of (shell) commandswhereis - where is a command (binary, source, manual)locate - find files by name cat /etc/network/interfaces - list network devices/interfaces Pipes and Redirection| - pipe character echo "hello world" &gt; hello.txt - write things to file; truncates before writing echo "hello world" &gt;&gt; hello.txt - appends output there are three channels: 0 - Standard Input (STDIN), 1 - Standard Output (STDOUT) and 2 - Standard Error (STDERR) to catch STDERR \u2013> 2&gt; (channel two), e.g. ls -lh someNoneExistingFile.txt 2&gt; action.log input redirection: mail -s "this is a test" thomas &lt; message.txt ps | less - show all processes and pipe it into the program less which shows big texts in way which is easy to navigate &amp;&amp; - check if left command is successful, then execute the right command ls file.txt &amp;&amp; echo "Success." > Success ls wrongfile.tct && echo \u201cSuccess.\u201d > Error vi basics:wq! 
- write, quit, don\u2019t prompt me Package managementapt-cache search ... - search for package (Ubuntu/Debian) apt-get remove ... - remove package apt-get autoremove - clean up unneeded packages Processesps aux | grep "process name" - get info about process kill PID - kill process (SIGTERM = 15) with specified PID pkill -u USERNAME - kill process of user nice -n 15 program - start a program with low priority (19 = lowest; -20 highest) renice -5 PID - change niceness (aka priority) of process /proc - directory of all processes which is managed by the kernel and holds all the information about processes (in sort of files) Filesystemman hier - man page on filesystem hierarchy (overview on filesystem) udevd - device daemon Places /bin - binaries for applications /boot - boot images /dev - devices /etc - configuration data for applications /home /lib / /lib64 - shared libraries* /mnt - mount /proc - process directory (informations about running processes) /opt - optional software, no clear convention using this /sbin - system binaries /tmp - temporary files, cleaned on restart /usr - non-essential binaries /var/log - system logs absolute and relative paths: /home/user/downloads and downloads/ Filetypes (with flag/first bit on ls -l) Regular file (-)Directory (d) Character Device (c) Block Device (b) Local Domain Socket (s) Named Pipe (p) Symbolic Link (l) File permissions rwx rw- r-- - owner read/write/execute group read/write anyone read chmod 777 - rwx for owner, group, anyone chmod 666 - rw- for owner, group, anyone chmod 444 - r-- for owner, group, anyone chmod 000 - --- for owner, group, anyone LXC (LinuX Containers)when operating with LXC one should be root; even basic stuff like lxc-ls will need root privileges /var/cache/lxc/distro - contains the cached images needed for creation of a LXC /var/lib/lxc/ - contains files for every created container (including rootfs) /var/lib/lxc/myfirstcontainer/config - config file (see man 5 lxc.container.conf lxc-create -t 
ubuntu -n myfirstcontainer - type = ubuntu, name = myfirstcontainer; note: type takes host system defaults if not otherwise specified regarding architecture and what not; note further: will do a net install which is stored lxc-ls --fancy - list running machines lxc-start -n myfirstcontainer -d - start LXC in daemon mode; doesn\u2019t hog up the current shell session, starts in background; connect via SSH to IPV4 lxc-stop -n myfirstcontainer -k - stop plus kill lxc-freeze -n myfirstcontainer - freezes the proccess lxc-attach -n myfirstcontainer - attaches current shell to container (avoiding to SSH in)",
            "tags": [

            ]
        },
        {
            "title": "Leaflet NomNom",
            "slug": "leaflet-nomnom",
            "date": "27.06.2014",
            "permalink": "http://opengeodata.de/2014/06/27/leaflet-nomnom/",
            "text": "If you\u2019d like the above map in your website, check out my little repo on Github: leaflet-nomnom. It consists mainly of a tiny geocoding python script (which undoubtly has its flaws - like using no QA checks) and some JS code for calling Leaflet with a randomized city from the geocoded set.",
            "tags": [

            ]
        },
        {
            "title": "PostGIS Setup",
            "slug": "postgis-setup",
            "date": "06.06.2014",
            "permalink": "http://opengeodata.de/2014/06/06/postgis-setup/",
            "text": "Just for ease of copy&paste (oughta make a script out of this). Install PostGIS (assuming Postgre is already installed); change version number, if neccessary sudo apt-get install postgis postgresql-9.3-postgisSetup a new database (with UTF-8 encoding): psql -U usernamecreate database spatial_database;Enable PostGIS for that database: psql -d spatial_database -f /usr/share/postgresql/9.3/contrib/postgis-2.1/postgis.sql;psql -d spatial_database -f /usr/share/postgresql/9.3/contrib/postgis-2.1/spatial_ref_sys.sql;psql -d spatial_database -f /usr/share/postgresql/9.3/contrib/postgis-2.1/postgis_comments.sql;Done. :]",
            "tags": [

            ]
        },
        {
            "title": "PostgreSQL and UTF-8 encoding (or: getting rid of SQL_ASCII)",
            "slug": "postgresql-and-utf-8-encoding-or-getting-rid-of-sql-ascii",
            "date": "04.06.2014",
            "permalink": "http://opengeodata.de/2014/06/04/postgresql-and-utf-8-encoding-or-getting-rid-of-sql-ascii/",
            "text": "After a vanilla installation of PostgreSQL on Ubuntu (12.04) you most likely will end up with the quite useless SQL_ASCII encoding for your tables. UTF-8 is handy for pretty much everything; so let's set UTF-8. First things first: I am starting out with an empty, new database (cluster). If you have no actual data to convert you set UTF-8 in two ways: 1) If you would like create a whole new cluster, use initdb. For example, one could do this: su postgres #switch to user postgis; necessary for the call of initdb cd /usr/lib/postgresql/9.3/bin ./initdb --pgdata /var/lib/postgresql/9.3/&lt;newdatabasecluster&gt; -E &#39;UTF-8&#39; --lc-collate=&#39;en_US.UTF-8&#39; --lc-ctype=&#39;en_US.UTF-8&#39;`</pre> Just switch the `&lt;newdatabasecluster&gt;` with a name you&#39;d like and you&#39;re set. 2) If you would like to have your new databases in UTF-8 without creating a new cluster: <pre>`psql -U &lt;user&gt; template1 # &lt;user&gt; could be postgres or any other user with sufficient rights update pg_database set encoding = 6, datcollate = &#39;en_US.UTF8&#39;, datctype = &#39;en_US.UTF8&#39; where datname = &#39;template0&#39;; update pg_database set encoding = 6, datcollate = &#39;en_US.UTF8&#39;, datctype = &#39;en_US.UTF8&#39; where datname = &#39;template1&#39;; This changes the encoding of the templates from which new database are created. 
So before actually using UTF-8 you could list all databases with \\l and would see this: List of databases Name | Owner | Encoding | Collate | Ctype | Access privileges -----------+----------+-----------+---------+-------+----------------------- postgres | postgres | SQL_ASCII | C | C | template0 | postgres | SQL_ASCII | C | C | =c/postgres + | | | | | postgres=CTc/postgres template1 | postgres | SQL_ASCII | C | C | =c/postgres + | | | | | postgres=CTc/postgres After the encoding change it'll look like this: List of databases Name | Owner | Encoding | Collate | Ctype | Access privileges -----------+----------+-----------+------------+------------+----------------------- postgres | postgres | SQL_ASCII | C | C | template0 | postgres | UTF8 | en_US.UTF8 | en_US.UTF8 | =c/postgres + | | | | | postgres=CTc/postgres template1 | postgres | UTF8 | en_US.UTF8 | en_US.UTF8 | =c/postgres + | | | | | postgres=CTc/postgres Create a new database (CREATE DATABASE test_GIS;) and voil\u00e1: List of databases Name | Owner | Encoding | Collate | Ctype | Access privileges -----------+----------+-----------+------------+------------+----------------------- postgres | postgres | SQL_ASCII | C | C | template0 | postgres | UTF8 | en_US.UTF8 | en_US.UTF8 | =c/postgres + | | | | | postgres=CTc/postgres template1 | postgres | UTF8 | en_US.UTF8 | en_US.UTF8 | =c/postgres + | | | | | postgres=CTc/postgres test_gis | tka | UTF8 | en_US.UTF8 | en_US.UTF8 | Alternatively, ffmike published a Gist for this issue.",
            "tags": [

            ]
        },
        {
            "title": "Open Source Photoblog Workflow with phpGraphy",
            "slug": "open-source-photoblog-workflow-with-phpgraphy",
            "date": "30.11.2013",
            "permalink": "http://opengeodata.de/2013/11/30/open-source-photoblog-workflow-with-phpgraphy/",
            "text": "As I am lucky enough to be able to travel a bit in the near future, I thought of blogging a few pictures en route. I wanted to do this from my Androidphone (2.3) without the hassle of logging in some software or website, let alone to use some BS like Instagram. I\u2019ll try to explain in all briefness. The main idea behind this is to have some sort of sync-mechanism between the phone and the blog which automatically renders directories and pictures as HTML. I choose phpGraphy as my main tool. It supports a flat file database and renders any content in its /pictures/ folder based on date of creation. Since I had some concerns to put the password for my main webspace into the hands of some random Android-app I created a uberspace for my pictures. If you\u2019re in or near Germany I highly recommend these guys - great support, incredible features, almost anonymous. The next was to get BotSync on my phone which enables uploads via SSH. You just specify directory on your phone to be uploaded and the destination on the server - a very slim & fast app. So, I can take a picture, organize it via some file management app and upload it to my picture-uberspace. To get it on my main webspace I used rsync with SSH. You can find plenty of tutorials on the net regarding this. rsync -avze 'ssh -i /home/user/.ssh/id' user@host.de:/home/user/html/and-upload/ /home/user/html/phpgraphy/pictures Since phpGraphy sometimes messes up the thumbnail creation with the original pics of my phone\u2019s camera, I run mogrify to resize them. Because I want to resize every picture in any directory the \u2018find\u2019 will look for any file with a jpeg-extension; the mogrify will set any side of the pic to 1400px which is more than enough. find /home/user/html/phpgraphy/pictures/ -name \"*.jpg\" -exec mogrify -resize '1400x1400>' {} ; Of course, these tasks need to be automated. Cron will take over and run these two commands every morning at six o\u2019clock. 
crontab -e 0 6 * * * /home/user/sync.sh phpGraphy supports furthermore a cool feature concerning the naming convention. You can set it up to use a EXIF field for the image\u2019s name. A few apps like Photo Editor allow to edit the EXIF metadata on Android. So I edit the User Comment EXIF field and tell phpGraphy to use this field as title. Works! The workflow: Take picture --> (Edit EXIF data) --> Put into upload directory on phone --> run BotSync --> done. The good thing is I can change this workflow really easy. I can log on my picture-uberspace from any computer without worrying about giving away important credentials and just put some pics in the upload-folder. Then, my main uberspace will fetch the pictures via rsync and puts them in phpgraphy/pictures. Of course, one could say: just use some service other than Instagram if you don\u2019t like it. But this way I can control my data w/o worry about some TOS, pricing plans or whatnot. And it\u2019s more fun to use something you set up.",
            "tags": [

            ]
        },
        {
            "title": "Crunchbang Keyboard Layout",
            "slug": "crunchbang-keyboard-layout",
            "date": "20.11.2013",
            "permalink": "http://opengeodata.de/2013/11/20/crunchbang-keyboard-layout/",
            "text": "I looked for a simple way to change my keyboard layout on crunchbang linux (Waldorf) and found a post by Bogdan Costea and a thread in the FreeBSD forums. Following their information I did this: sudo gedit /etc/default/keyboard Besides XKBLAYOUT you add the desired languages (do ls -la /usr/share/X11/xkb/symbols/ for list of available languages). And XKBOPTIONS sets the keyboard shortcut. The FreeBSD-thread lists these possible shortcuts: grp:toggle \u2013 Right Alt grp:shift_toggle \u2013 Two Shift grp:ctrl_shift_toggle \u2013 Ctrl+Shift grp:alt_shift_toggle \u2013 Alt+Shift grp:ctrl_alt_toggle \u2013 Ctrl+Alt grp:caps_toggle \u2013 CapsLock grp:lwin_toggle \u2013 Left \"Win\" grp:rwin_toggle \u2013 Right \"Win\" grp:menu_toggle \u2013 Button \"Menu\" So my keyboard file looks currently like this: XKBMODEL=\"pc105\" XKBLAYOUT=\"de,fr,us\" XKBVARIANT=\"\" XKBOPTIONS=\"grp:menu_toggle\" Bogdan suggests using the small program fbxkb to visualize the current layout. Just add the following to ~/.config/openbox/autostart: ## Run indicator for keyboard layout fbxkb & This will show little language indicating png-graphics in the taskbar which look like text. To replace them with actual flags go to /usr/share/fbxkb/images and overwrite the png-files (need to be superuser). One convenient way are the famfamfam flags which use the same naming convention. Just overwrite everything inside the fbxkb-folder. Note: this still looks somehow crappy since fam\u00b3 flags are in 16x16 px and fbxkb displays in 24x24 px - but good enough for me. And btw: if you are looking to do a \u00e7 (cedille) or other special characters without big hassle check out the German Ubuntu wiki. To enter the french ligatures like \u0276 or \u00c6 press and hold Ctrl + Shift + u and enter the hexdecimal unicode - see selfhtml or shapecatcher for info on the codes.",
            "tags": [

            ]
        },
        {
            "title": "Set GPS update rate on Arduino Uno + Adafruit Ultimate GPS Logger Shield",
            "slug": "set-gps-update-rate-on-arduino-uno-adafruit-ultimate-gps-logger-shield",
            "date": "17.11.2013",
            "permalink": "http://opengeodata.de/2013/11/17/set-gps-update-rate-on-arduino-uno-adafruit-ultimate-gps-logger-shield/",
            "text": "Just in case someone wants to alter the GPS update rate on a Adafruit Ultimate GPS Logger Shield. This may come in handy if you want to reduce the power consumption of your board. According to a datasheet of the GPS-chip the max update rate is 10000ms/10sec. So how do you set it? Relatively simple: go to your Adafruit_GPS-library, open Adafruit_GPS.h and look for these lines: #define PMTK_SET_NMEA_UPDATE_1HZ \"$PMTK220,1000*1F\" #define PMTK_SET_NMEA_UPDATE_5HZ \"$PMTK220,200*2C\" #define PMTK_SET_NMEA_UPDATE_10HZ \"$PMTK220,100*2F\" So this will define the name of the method/function/whatever which will be called in your Arduino-sketch, e.g. _PMTK_SET_NMEA_UPDATE1HZ. PMTK220 is the chip-internal code for update rate. So we say: Hey, I want to alter the update rate. The value after the comma is the update rate in milliseconds. We set it to 10000 (or whatever you like). The value behind the is the checksum which the chip requires. Thanks to Stevens post about reading GPS data with bash I stumpled upon the MTK NMEA checksum calculator. So you put in PMTK220,10000 and get back _$PMTK220,100002F_. That\u2019s it. Our new line would read: #define PMTK_SET_NMEA_UPDATE_10SEC \"$PMTK220,10000*2F\" Just set PMTK_SET_NMEA_UPDATE_10SEC in your sketch and upload it.",
            "tags": [

            ]
        },
        {
            "title": "PirateBox - some tipps",
            "slug": "piratebox-some-tipps",
            "date": "27.09.2013",
            "permalink": "http://opengeodata.de/2013/09/27/piratebox-some-tipps/",
            "text": "PirateBox is, without a doubt, a great project. Nevertheless there are some things to consider and also some things to improve. I\u2019ll just make a short list of what I learnt. You are warmly invited to comment. I used the OpenWrt-version of Piratebox. 1) This might be obvious but I never conceived the notion of it until I worked with PirateBox and the TP-Link MR3020-Router: you\u2019re just dealing with linux. After SSH-ing into the router just be free to explore and play around. cd and ls the hell outta this thing. 2) Simplest mode of operating the box is either via wall socket or a battery. Note there are premade affordable 12V to 5V USB-converters available. Just search for \u201812v 5v usb\u2018 on ebay or somewhere else. 12V (car) batteries are available in your local electronics store (maybe even the converter). A 7000 mAh battery should give you about a day of operating off-grid. This will vary of course due to wireless usage, router type and battery quality. 3) Tech and \u2018open something\u2019 people like the word \u2018pirate\u2019 - it\u2019s freedom, it\u2019s controlling your destingy, taking what\u2019s yours, operating outside of incrusted structures. For other people it may be - at best - adventure tales and the pirate party (which has a arguable reputation) or - worse - illegal activity, stealing, hacking and so on. So, I decided to alter the SSID of my PirateBox. I called it Open Library - Share freely (instead of PirateBox - Share freely). To do this SSH into the router and follow these instructions. To mirror this information: Edit the wireless file on the router by vi /etc/config/wireless ([vi cheatsheet](http://www.lagmonster.org/docs/vi.html)) Look for the SSID option and alter the string ([allowed chars](https://forum.snom.com/index.php?showtopic=6785#entry16505)), save it and type /etc/init.d/network reload You should now be able to use your new SSID. 
I'd always choose something welcoming; 'NSA surveillance van' is maybe not a good idea. ;) 4) Furthermore, I altered the landing page of PirateBox, for two reasons: first, the PirateBox logo without explanation may be intimidating for some people. Second, not everyone can read English well enough to be comfortable in this new context. So I changed the PirateBox logo to a pictogram I found on the PLA blog (number 42). Less intimidating, while preserving the notion of sharing. To change the logo as well as the text on the landing page, cd to /opt/piratebox/www/ and ls -a. You'll find index.html (the landing page), piratebox-logo-small.png (the logo on the landing page) and .READ.ME.htm (the about page). Code snippets for a German 'customisation' are below this post. The big logo on the about page stayed the same, since I wanted to give credit to the project. But how do you get this stuff onto your computer to edit it? [scp](http://blog.linuxacademy.com/linux/ssh-and-scp-howto-tips-tricks/#scp) will help you. The article on scp explains it quite well, but just for the record: scp source target (the general idea behind scp); scp /opt/piratebox/www/index.html user@yourhost:/home/user/ (this copies index.html into your home directory; of course, if you're already in the directory, just put the filename as source; you'll need the password for 'user' on your local machine); scp user@yourhost:/home/user/index.html /opt/piratebox/www/ (and copy the file back to the router; overwrites without warning!). Of course, you can edit all the files on the router with `vi`, but it's more comfortable this way, I guess. So, edit the files the way you want - all you need is a bit of HTML knowledge. I started with a little disclaimer that nobody is trying to hack the user's computer or do something illegal. But I think the localisation is the important part; make PirateBox accessible by using your local language. 
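The whole scp round trip condensed into one sketch (run on the router; 'user@yourhost' is just a placeholder for your own machine):

```shell
# 1. Pull the landing page from the router to your machine
scp /opt/piratebox/www/index.html user@yourhost:/home/user/
# 2. Edit /home/user/index.html locally with your favourite editor
# 3. Push it back to the router (overwrites without warning!)
scp user@yourhost:/home/user/index.html /opt/piratebox/www/
```
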
(Though, I'd leave the English version as it is, to honour the work of David and to be accessible to international folks.) Well, that's it. Have fun with shared information on PirateBox and leave a comment. :) -------------- Snippets: `**index.html**` <div><img src=\"/lib.jpg\"/></div> <div id=\"message\"> <b>1.</b> Was ist das hier alles? <a href=\"/.READ.ME.htm\" target=\"_parent\"><b>Antworten hier</b></a>.<p> <b>2.</b> Lade etwas hoch. :) Einfach unten Datei auswaehlen und los geht's.</p> <b>3.</b> Anschauen und Runterladen des Vorhandenen kannst du <a href=\"/Shared\" target=\"_parent\"><b>hier</b></a>.<br> </div> `**.READ.ME.htm**` <table border=0 width=50% cellpadding=0 cellspacing=0 align=center> <tr> <td width=\"75\"><br></td> <td><p>Erstmal: keine Angst - niemand hat vor, dich zu hacken oder zu illegalem Treiben zu verleiten. :)</p> <p>PirateBox entstand aus den Ideen des Piratenradios und des 'free culture movement' - Artikel darueber findest du auf Wikipedia. Ziel ist dabei, ein Geraet zu erschaffen, welches autonom und mobil eingesetzt werden kann. Dabei setzt PirateBox auf freie Software (FLOSS), um ein lokales, anonymes Netzwerk zum Teilen von Bildern, Videos, Dokumenten, Musik usw. bereitzustellen.</p> <p>PirateBox ist dafuer gemacht, sicher und einfach zu funktionieren: keine Zugangsdaten, keine Mitschnitte, wer wann auf was zugegriffen hat. PirateBox ist nicht mit dem Internet verbunden, sodass niemand (nein, nicht mal die NSA) mitbekommt, was hier geschieht.</p> <p>PirateBox wurde von David Darts erschaffen und steht unter einer Free Art License (2011). Mehr ueber das Projekt und wie du dir einfach eine eigene PirateBox bauen kannst, findest du hier: http://wiki.daviddarts.com/piratebox</p> <p>Mit der Partei hat dies hier uebrigens nichts zu tun. ;)</p> <hr /> </td> <td width=\"25\"><br></td> </tr> </table>",
            "tags": [

            ]
        },
        {
            "title": "Convert youtube to audio",
            "slug": "convert-youtube-to-audio",
            "date": "20.04.2013",
            "permalink": "http://opengeodata.de/2013/04/20/convert-youtube-to-audio/",
            "text": "So, you want to archive all that cool music these crazy people put on youtube? Be my guest. :] First of all: check yourself before you wreck yourself. I will definitely not do this but some people trick you with code snippets. One can easily put some hidden characters via CSS into innocent commands that may look ok in the browser but do terrible stuff in your console. So before hitting the big enter button, read the code you are going to input into your terminal. But let\u2019s get to the fun. You need to get youtube-dl (readme); so on Ubuntu you may type: sudo apt-get install youtube-dlChances are good that youtube changed its API mechanisms since the last repo update, so run the internal update function. You need root privileges since the update wants to alter some stuff in /usr/bin/youtube-dl: sudo youtube-dl -UYoutube-dl should work now just perfect. In my case youtube-dl needs to do some batch stuff. I don\u2019t exactly want this (this gets one video and saves it to your hard drive): youtube-dl 7yJMLArxPaAI want this: youtube-dl -a batch.txtSo put all these nice videos respectively their video ID in a text file and run the command like above. The program will fetch one video after another and do some processing (if indicated). Now, you may want to play around with the options of youtube-dl (see readme). For example the -t option will use the title as filename; very handy. But we are only halfway there because what you have now is either flv or mp4 which contains the audio and video track. youtube-dl has an internal audio converter which relies on ffmpeg, so you can easily go like this: youtube-dl -a batch.txt -t -x \u2013audio-format \u201cwav\u201dThis will get the IDs in batch.txt, fetch the title and put it as filename and directly convert the videos to wav. 
For mp3 conversion you will have to install MP3 support for ffmpeg: sudo apt-get install ffmpeg libavcodec-extra-53 The following command will give you high-quality mp3 files (which is, by the way, kind of unnecessary, since you'll never get perfect quality from youtube; expect stuff that sounds good on your mobile audio player and home system but terrible on anything close to Hi-Fi; so you may want to save some disk space by setting the quality to around 4 or 5): youtube-dl -a batch.txt -t -x --audio-format \"mp3\" --audio-quality 0 But mp3 isn't cool. You know what's cool? 1 Billi\u2026 err, ogg/vorbis, I mean. ;] So just enter something like this and get some patent-troll-free musical goodness: youtube-dl -a batch.txt -t -x --audio-format \"vorbis\" --audio-quality 8 Be aware that the quality parameter is reversed for ogg/vorbis - 0 is low quality, 9 is high quality. Furthermore, you may want to check out hardware with ogg support, so as not to rely on mp3 patents (I once had a SanDisk player). If you have any questions, feel free to comment. Edit: To enhance your \"productivity\" you may install a clipboard manager like parcellite (or any other), copy the URIs of the desired videos, find the history file (in the case of parcellite it resides at ~/.local/share/parcellite/history; same goes for glipper), clean that file up a bit and use it as your batch.txt - cool, eh? Edit 2: Be careful with playlist parameters (&list=xyz); youtube-dl tries to fetch the whole playlist. Playlists with hundreds or thousands of videos aren't unusual, so better limit the number of items to be fetched: youtube-dl yourURI --playlist-end NUMBER Also, when messing around with playlists, make sure to set the -i parameter, so youtube-dl will ignore errors like \"Sorry, not available in your country\".",
            "tags": [

            ]
        }
    ]
}