Tuesday, November 27, 2012

Configure Git, Like a Boss

I have just created a Git presentation named Git, Practical Tips. It contains good practices that I have picked up during my four years as a Git user.

The presentation consists of six parts: A quick introduction; History manipulation with merge, rebase and reset; Finding with Git; Configuration; Under the hood; and Interacting with Github.

If you find this interesting and would like to hear a very practical presentation about Git tips and tricks, feel free to contact me :)

In this post I will describe how to configure Git to work well from the command line. It consists of two main parts, Git configuration and Bash configuration.

I will only describe some select samples of my configuration here. If you want to see more, my configuration files are on Github.

Git Configuration

The Git configuration part is just a bunch of aliases I use. Some are simple and some are more advanced. The aliases are declared in my global Git config file, ~/.gitconfig, under the [alias] section. Here are some of the most important ones.

git add --patch

[alias]
ap = "add --patch"

git ap (git add --patch) is awesome. It lets me add selected parts of the changes in my working directory, allowing me to create a consistent commit with a simple clear commit message.
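For every hunk, Git asks what to do with it. These are the answers I use the most (the exact list of options varies between Git versions):

$ git ap
# Stage this hunk [y,n,q,a,d,s,e,?]?
#   y - stage this hunk
#   n - do not stage this hunk
#   s - split the hunk into smaller hunks
#   e - manually edit the hunk
#   q - quit; do not stage this hunk or any of the remaining ones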

git add --update

au = "add --update"

git au adds all the changed files to the index. I use it mainly when I forget to remove a file with git rm and instead remove it with rm. In this case Git will see that the file is missing but not staged for removal. When I run git au it will be added to the index as if I had used git rm in the first place.
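For example, with a hypothetical old-file.rb:

# The file is removed with plain rm
$ rm old-file.rb
$ git status --short
 D old-file.rb
# git au stages the deletion, just as git rm would have
$ git au
$ git status --short
D  old-file.rb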

git stash save -u

ss = "stash save -u"

git ss stashes everything in my working directory, including untracked files (-u). The reason I use git stash save instead of just git stash is that it allows me to write a message for the stash, similar to a commit message.
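A typical call looks like this (the message is of course just an example), and the message then shows up in git stash list:

$ git ss "WIP: half-done refactoring of the parser"
$ git stash list
stash@{0}: On master: WIP: half-done refactoring of the parser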

git amend

amend = "commit --amend -c HEAD"
amendc = "commit --amend -C HEAD"

git amend lets me add more changes to the previous commit. It is very useful when I forget to add a change to the index before I commit it. It amends the new changes in the index and lets me edit the old commit message. git amendc does the same thing but reuses the old commit message.
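For example, when a file was left out of the last commit (hypothetical file name):

$ git add forgotten-file.rb
# Folds the staged change into the previous commit, keeping the old message
$ git amendc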

git alias

alias = "!git config -l | grep alias | cut -c 7-"

git alias shows me all my aliases. Starting the alias with a bang (!) is necessary to execute arbitrary bash commands; note that the git command itself must then be included. The code in the alias means: list the configuration, keep the lines containing aliases, and show characters 7 and onward (which strips the alias. prefix).
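With the aliases in this post, the output looks roughly like this:

$ git alias
ap=add --patch
au=add --update
ss=stash save -u
amend=commit --amend -c HEAD
amendc=commit --amend -C HEAD
alias=!git config -l | grep alias | cut -c 7-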

git log --diff-filter

fa = "log --diff-filter=A --summary"
fd = "log --diff-filter=D --summary"

git fa (find added) and git fd (find deleted) show me a log of the commits where files were added or deleted, respectively. They are great for finding out how and when my files get deleted. I use them with a filename, git fd my-missing-file.rb, or with grep, git fd | grep -C 3 missing.

grep -C 3 means show me 3 lines of context around each matching line.
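A run for a single file might look something like this (the commit and its message are hypothetical):

$ git fd my-missing-file.rb
commit 3f2a1bc...
Author: ...
Date:   ...

    Removed the obsolete helper

 delete mode 100644 my-missing-file.rb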

git log-pretty

l = "!git log-hist"
log-hist = "!git log-pretty --graph"
log-pretty = "log --pretty='format:%C(blue)%h%C(red)%d%C(yellow) %s %C(green)%an%Creset, %ar'"

git l is my main logging command, and it prints a beautiful, compact log. When I reuse an alias I must use the shell form, with the bang (!), since Git does not allow one alias to reference another directly.

log               = The log command
--graph           = Text-based graphical representation
--pretty='format' = Format according to spec
%C(color)         = Change color
%h                = Abbreviated commit hash (6b266c2)
%d                = Ref names (HEAD, origin/master)
%s                = Subject (first line of comment)
%an               = Author name
%ar               = Author date, relative

git log --simplify-by-decoration

lt = "!git log-hist --simplify-by-decoration"

git lt (log tagged) uses --simplify-by-decoration to show a list of "important" commits. Important in this case means commits that are pointed to by a branch or a tag. It reuses the log-hist alias above.
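The output has the same compact format as git l, but only the decorated commits show up. Using the history from the git l example further down, it would be reduced to something like this:

* 4f71f8d (HEAD, heroku/master, master) Send 404 for missing ...
* 09c178f (origin/master) id cannot be a number Anders Janmyr, 6 weeks ago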

Bash Configuration

I used to have a bunch of Bash aliases, such as ga, gd, etc., but now I use my Git aliases instead. I still have configuration for command completion and a nice, informative prompt.

function g()

I use the git command more than any other command during a day's work, and git status is the subcommand I use the most. I have optimized for this by creating a function, g(), that uses status as its default argument.

# `g` is a shortcut for git, it defaults to `git s` (status) if no argument is given.
function g() {
    local cmd=${1-s}
    shift
    git "$cmd" "$@"
}

The g() function gives me a lot of power out of a single character.

$ g
## master
 M README.md
?? doc.md

$ g l
* 4f71f8d (HEAD, heroku/master, master) Send 404 for missing ...
* ec00879 Added support for options Anders Janmyr, 5 weeks ago
* 09c178f (origin/master) id cannot be a number Anders Janmyr, 6 weeks ago
* e561d03 Send status and send in one call Anders Janmyr, 6 weeks ago
* 9615be5 Added some more logging Anders Janmyr, 6 weeks ago
* de4730e Improved the code somewhat Anders Janmyr, 6 weeks ago
* 1f3f763 Added allow methods header Anders Janmyr, 6 weeks ago
* ca3065c Added filter to documentation Anders Janmyr, 6 we

function gg()

My second (and last) function is gg().

# Commit pending changes and quote all arguments as message
function gg() {
    git ci -m "$*"
}

gg() allows me to type a commit message without any quotes.

$ gg Added todo list to the Readme
[master 98556af] Added todo list to the Readme
 1 file changed, 1 insertion(+)

bash-completion

Installing bash-completion gives me command completion for commands, subcommands and more.

# An example
$ git rem<TAB> sh<TAB> o<TAB>
# will complete to
$ git remote show origin

I use Homebrew to install Git, brew install git. It gives me a new version of Git. It also installs git-completion.bash in /usr/local/etc/bash_completion.d/.

I use the same configuration on Ubuntu, so I also check for the file in /etc/bash_completion.d/.

# Prefer /usr/local/etc but fallback to /etc
if [ -f /usr/local/etc/bash_completion.d/git-completion.bash ]
then
    source /usr/local/etc/bash_completion.d/git-completion.bash
elif [ -f /etc/bash_completion.d/git ]; then
    source /etc/bash_completion.d/git
fi

This is great, but what about my beautiful little g() function? How do I make it work with command completion? It turns out to be quite easy: include the following little snippet in a configuration file, such as .bashrc.

# Set up git command completion for g
__git_complete g __git_main

The snippet reuses the functions __git_complete and __git_main, included with git-completion.bash, to make completion work with g too. Lovely!
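Completion for g now behaves just like it does for git:

# An example
$ g rem<TAB> sh<TAB> o<TAB>
# will complete to
$ g remote show origin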

bash-prompt

In later versions of Git, the prompt functionality has been extracted out into its own script, git-prompt.sh. I include it like this.

if [ -f /usr/local/etc/bash_completion.d/git-prompt.sh ]
then
    source /usr/local/etc/bash_completion.d/git-prompt.sh
fi

I configure my prompt like this; it contains a little more magic than the plain Git configuration. I put it in one of my Bash configuration files, such as .bashrc.

function prompt {
  # Capture the exit status of the last command immediately,
  # before the test below overwrites $?
  local exit_status=$?
  # GREEN, RED, LIGHT_GRAY and NO_COLOR are color variables defined
  # elsewhere in my bash configuration
  if [[ "$exit_status" -eq "0" ]]; then
    # If it is OK (0) color the prompt ($) green
    local status=""
    local sign=$(echo -ne "\[${GREEN}\]\$\[${NO_COLOR}\]")
  else
    # If not OK (not 0) color the prompt ($) red and set status to the exit code
    local status=" \[${RED}\]${exit_status}\[${NO_COLOR}\] "
    local sign=$(echo -ne "\[${RED}\]\$\[${NO_COLOR}\]")
  fi
  # Get the current SHA of the repository
  local sha=$(git rev-parse --short HEAD 2>/dev/null)
  # Set the prompt
  # \!                 - history number
  # :                  - literal :
  # \W                 - Basename of current working directory
  # $sha               - The SHA calculated above
  # $(__git_ps1 '@%s') - literal @ followed by Git branch, etc.
  # $status            - The exit status calculated above
  # $sign              - The red or green prompt, calculated above
  export PS1="[\!:${LIGHT_GRAY}\W${NO_COLOR} $sha${GREEN}$(__git_ps1 '@%s')${NO_COLOR}$status]\n$sign "
}

# Tell bash to invoke the above function when printing the prompt
PROMPT_COMMAND='prompt'

The function __git_ps1() can be configured further with some environment variables. This is what I use.

# Git prompt config
export GIT_PS1_SHOWDIRTYSTATE=true
export GIT_PS1_SHOWUNTRACKEDFILES=true
export GIT_PS1_SHOWUPSTREAM="auto"
# export GIT_PS1_SHOWSTASHSTATE=true

The resulting prompt looks like this:
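With the format above it comes out something like this (the history number, directory name, SHA and flags are just illustrative):

[712:blog 4f71f8d@master *%]
$

After a failed command, the exit code is included and the $ turns red:

[713:blog 4f71f8d@master *% 127 ]
$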

The different signs to the right indicate:

# * - Changed files in working dir
# + - Changed files in index
# % - Untracked files in working dir
# < - The branch is behind upstream
# > - The branch is ahead of upstream (Yes, it can be both)

More info can be found in git-prompt.sh.

Credits

Obviously I have not figured this out all by myself. Here are some of my sources:

Sunday, November 11, 2012

The Double Rainbow at Øredev 2012

At Øredev I had a discussion with some people about how the double rainbow comes about, so in the spirit of sharing knowledge I thought I would share how it works.

The first rainbow comes from light being reflected once inside the raindrop; similar to a prism, the refraction of the light in the drop splits the white light into different colors.

Compare with the prism on Pink Floyd's Dark Side of the Moon.

The secondary rainbow, on the other hand, comes from the light being reflected twice inside the drop.

This is the reason the colors of a real secondary rainbow are inverted.

If you find this kind of stuff interesting I can recommend the book Light and Color in the Outdoors. It is expensive, so you may want to borrow it from the library (in Sweden) instead.

Friday, November 09, 2012

Øredev 2012, Notes from Day Two

The Rebellion Imperative, Reginald Braithwaite

Every daring attempt to make a great change in existing conditions, every lofty vision of new possibilities for the human race, has been labelled utopian. -- Emma Goldman

Wealth breeds inefficiency

Since a wealthy man can buy anything he wants, he has no need for efficiency.

The powerful enforce stasis.

Progress is against their best interests.

Stasis, the state of equilibrium or inactivity caused by opposing equal forces. -- dictionary.com

The four sustainable positions from Marketing Warfare

  • The Leader Defends
  • The Rival Attacks
  • The Innovator Disrupts
  • The 99% are Rebels

Android is for people that don't need to be cool!

Don't attack their weakness, attack their strength!

Principles of Rebellion

  • Find a territory small enough to defend - and don't overextend yourself.
  • Never act like the leader.
  • Be prepared to bug out at a moment's notice.

God fights on the side with the most cannons. -- Napoleon

When you are being rebellious, you don't want a lot of dependencies, like infrastructure etc. You need to be in control of your own destiny.

Bug out

You can't fall in love with what you are doing, because most of the time it will not work out.

A rebel does not have resources to waste on a lost cause.

Rebels are lean

  • Rebels have lean businesses.
  • Rebels use lean tool chains.

Disruption

If there is an existing market, you are doing it wrong.

Analysts are wrong because they are corrupt motherfuckers who are paid to make press releases. -- Reginald Braithwaite

You can't ask customers, you have to tell them!

Micro Service Architecture, Fred George

Unit test counts are important; the more you get working, the more you show progress.

1M loc -> 100K loc trying to get out -> 20 * 5K loc services, trying to get out

Only nuggets should be published by the services. Nuggets are small pieces of information, not the whole blob. If you need everything, go get it from the source.

Clients that are not desperate will not try anything interesting at all.

Bayesian Service Principles

  • It's okay to run more than one version at the same time.
  • You can only deploy one service at a time.

Observations

  • Services are like classes: small, crisp conceptualization, 100 loc.
  • Smalltalk messages are perfect for services.
  • Encapsulation: every service owns its data.
  • Services became disposable; no need for tests, just rewrite them.
  • Loosely coupled via RESTful JSON.
  • Self-monitoring services replace unit tests.
  • Business monitoring replaces acceptance tests.
  • Services are language-agnostic: Ruby, Node, Clojure.

Problems

  • Cycle, lost packets
  • Need to redefine tracking, logging
  • No experience in defining services.

Data Ecosystems

Collect all the data, In The Plex

  • Consumers, R, Hive, Monitoring, Apps, Services
  • Producers, app, app, service, web server

Conclusions

  • Agile Best Practice Not Used
  • Technical Debt does not appear, services are so small.

Principles

  • Very small
  • Loosely coupled
  • Multiple versions acceptable
  • Self-execution monitoring of each service
  • Publish interesting stuff
  • Application is a poor conceptualization

Living Software System

  • Long-lived system, short lived services
  • Extremely dynamic with continuous deployment, 5-10 minutes between deploys.
  • Accept it is complex (especially for testing)
    • Acceptance test on business outcomes instead.
  • Radical impact to processes, anarchy
  • There will be a learning curve for developers!

Scalable and Modular CSS FTW, Denise Jacobs

Denise gave an overview of some other people's ideas. The ideas come from Object Oriented CSS, SMACSS, and CSS For Grown Ups.

OO CSS (I think)

DRY, Don't Repeat Yourself: never repeat a style property if you can group selectors with shared properties. Ask yourself: why isn't this part of the group?

  • Use a grid
  • Use a reset CSS
  • Use the cascade
  • Separate structure and presentation
  • Style classes rather than elements
  • Don't use IDs for styling, only for JS hooks
  • Avoid class names that are related to the style.

SMACSS

Categories of styles

  • Base
  • Layout
  • Module
  • State
  • Theme

Shallow Selectors

  • Use class names as the right-most key selector.
  • You're better off adding classes than having overly-specific selectors.
  • Module candidates navbars, carousels, dialogs, widgets, tables, icons
  • Naming conventions are very important. Examples: is-active, is-collapsed, btn-pressed

CSS for Grown Ups

Don't style pages, instead style modules and make a library that you can reuse.

Categories of Styles

  • document
  • base
  • module
  • layout

Classes used as Javascript hooks are prefixed with js-

Common ideas

  • IDs not so much
  • Classes are good
  • Less is more
  • Use modules
  • Naming conventions? !important

  • Structure and inform, naming conventions, grids

  • Reduce, no inline styles, short chains, don't use elements, prefer classes over IDs

  • Recycle and Reuse, Leverage the cascade, modularize page components, extend through sub classing.

Measure twice, Cut once.

Techniques for improving CSS

  • Optimize Selectors
  • Reduce the redundant
  • Clear the cruft
    • Eliminate styles that rely on qualifiers high in the DOM
    • Create portable styles
    • Institute a grid: box-sizing, make it fluid with max-width: 100%
  • Separate the modules into separate files.

New best practices

  • normalize.css
  • Use a better Clearfix
  • Use an <hr> as a divider
  • Image replacement
  • Use an <i> as an icon element.

Less, the Path to Good Design, Sandi Metz

  • The Design is the Code
  • Code needs to work once and be easy to change forever
  • The purpose of design is to minimize the cost of change.
  • Loosely coupled, highly cohesive, easily composable, context independent
  • Transparent, Reasonable, Usable, Exemplary
  • Design is never perfect, the code is never done.
  • Done is good enough!

  • Sequence diagrams are great for object oriented design.

  • Managing dependencies is at the center of design.

  • The objects of which you are most uncertain are probably the ones at the center of your design; they are your core objects.

  • Uncertainty, shows you where to decouple.

  • Trustworthy objects know more about what they want and less about what others want.

  • Practical Object-Oriented Design in Ruby

Typescript, Mads Torgersen

Typescript is a typed superset of Javascript that compiles to plain Javascript. It adds optional static types, classes, and modules. It compiles to idiomatic Javascript.

The type system

Being a superset means that raw Javascript is allowed in Typescript. Since typing is optional, errors should be seen as warnings. The language supports type inference, which gives you type safety without having to annotate every variable.

The biggest advantage of the static typing is that it gives good code completion. The type system is structural, similar to Go, which allows duck-typing-like calls. It supports optional properties, so it is more about guiding you than forcing you to follow strict rules.

Functions in Typescript support overloading, e.g. foo(x: string) and foo(x: number). Functions can also be declared to have properties.

It is also possible to declare index accessors ([]) on object types, like this.

var foo: { [x: string]: any; };

Typescript supports type inference on the built-in Javascript objects by means of declaration files provided by the IDE.

Classes and Modules

Classes in Typescript follow the Ecmascript standard and add types, privates, and statics.


class Greeter {
    private greeting: string;
    constructor (message: string) {
        this.greeting = message;
    }
    greet() {
        return "Hello, " + this.greeting;
    }
    static default = new Greeter('Hola');
}   

var greeter = new Greeter("world");

Typescript also supports lambdas (=>), which provide lexical scoping of this, similar to what Ecmascript will provide.

Modules

Modules are declared with the module keyword and export properties explicitly.

module Sayings {
    export class Greeter {
        greeting: string;
        constructor (message: string) {
            this.greeting = message;
        }
        greet() {
            return "Hello, " + this.greeting;
        }
    }
}
var greeter = new Sayings.Greeter("world");

Datomic, Tim Ewald

Fact: an event or thing known to have happened or to exist. Datom: a tuple of entity, attribute, value, and time.

Everything can be encoded by datoms.

A collection of facts

  • An append-only model
  • Data is immutable
  • Database state at a given time is a value

Datomic Architecture

  • Separate read, write, and storage.
  • All apps reads from a local cache.
  • A transactor supervises the writes and reflects them out to the peers.

Reading

  • Read / query db value in the peer.
  • Long-running work on consistent image.
  • Opening query to clients with bulkheads
  • Cross-database joins

Query

  • Datalog-based query language
  • Can invoke code
  • Can define views using rules
  • Can invoke application specific code
  • Supports querying the past and a possible future.

Direct Access

  • Entity Maps
  • Direct Index Access via iteration

Writes

  • Writes managed by transactor, single point of failure
  • True ACID semantics, transactor functions can be stored in the database.
  • Reified transactions can be queried as normal data.
  • Transaction notifications can be listened to by the peers.
  • Transactions return db-before, db-after, tx-data, tx-id

Storage

  • mem, dev, ddb, sql, infinispan, riak, couch, etc.
  • Pick a service that provides the scale and availability you need
  • Migrate as needed with backup and restore.

Thursday, November 08, 2012

Øredev 2012, Report from Day One

Software won, so what now? - David Rowan

A healthy disregard for the impossible.

Software developers have the power to make a massive impact. We should do something other than help people pass the time or sell more on the Internet!

We should do this by making it easier for non-software developers to write programs. One solution, David believes, is to create new languages, for example DOG!? I don't agree with him that this is a good way forward; Applescript, anyone?

The Internet can be used to create greater transparency for government and companies. Some examples

  • I paid a bribe web site.
  • Thank doctors who work without taking bribes.
  • Making transparent elections.
  • TED, Ideas worth spreading.
  • Publicising data to get help with data mining.
  • Random hacks of kindness
  • Education, Udacity, CoderDojo, Rails & Girls
  • Government, Street Bump, Don't Eat At, Open Government API
  • Health, Asthmapolis

Vim, Precision Editing at the Speed of Thought, Drew Neil

We spend a lot more time editing text than inserting text, so it is only appropriate that the keys can be used without modifier keys.

Vim is more like playing a melody than playing a chord.

Targets

The ideal way to move the cursor to the target is to look at the place where you want the cursor to go and it would just go there.

The mouse is very intuitive for finding targets, but it requires precision and breaks your typing flow.

Hunt and peck is the standard interface in a less capable editor if you are not using the mouse. That means using the arrow keys to find the proper place, possibly after first moving to an area closer to the spot you want.

Vim provides precision targeting: short commands that allow you to move to the spot you want. An example of this is the f command.

f{char} moves you to the first occurrence of the character char.

fb moves you to the first b on the line.

Vim also has something called text objects. Text objects are pieces of text delimited by characters such as quotes, parentheses, or tags.

vi{delimiter} Visually selects the text Inside of the delimiter. va{delimiter} Visually selects All the text including the delimiter.

`vi)` selects the text inside the parentheses.
`di)` deletes the text inside the parentheses.

`va"` selects the text and the quote.
`da"` deletes the text and the quote.

Vim allows navigation between files with commands like:

gf to move to the filename under the cursor.

ctags is an external indexing program that lets you jump directly to definitions and perform other navigation common to much larger IDEs.

This was a good introductory presentation to Vim, but nothing new for experienced Vimmers.

Software in the Age of Sampling, Brian Foote

Most new software development is really sampling and mashups. We take libraries or code snippets that already work and tweak them to work the way we want.

Copy and paste is a really powerful technique to get us started. We all do it all the time.

If you are going to steal, steal from the best

But when you steal, make sure it works by understanding it or putting it under test. Also make sure to add your own style to what you steal, so that it fits with your own code base.

What's new in ASP.NET 4.5, Damian Edwards

ASP.NET 4.5 comes with some good stuff built in, OAuth, NuGet, Asset handling, and async support.

OAuth authentication supported with different providers. Supports connecting and disconnecting the local account from the external providers.

Package manager is pre-configured to use NuGet.

Support for concatenation and minification of Javascript via Bundles. Support for precompilation and minification of CSS via external Bundle plugins.

Async

  • Make async handlers by extending HttpTaskAsyncHandler or Forms.Page
  • HttpClient is exclusively async.
  • Task.WhenAll and Task.WhenAny to handle parallel task synchronization.
  • CancellationTokens help you cancel many tasks at the same time.

Microsoft has done a good job integrating async into DotNet.

Testing Online Crazy Glue, Chris Hartjes

Testable applications are better!

Chris introduced us to testing PHP. The sad news is that even though PHP is a dynamic language, it has to be tested as if it were static.

The good news is that testing static languages is a solved problem. It is solved by tools, strategies, and automation.

  • The tool of choice is PHPUnit.
  • The strategy is dependency injection and mocks.
  • Automation is solved with a simple script, and version control hooks or a CI-server.

It is not hard to do TDD in PHP, the tools are there, use them!