Given/When/Then

I recently started writing user stories and acceptance tests. My first drafts of acceptance tests were bad: they were ambiguous, with little clarity around what I would actually be delivering to the client.

It’s important to make sure you and the client, whether that person is part of your own company or a company you are consulting for, are on the same page. The last thing you want is to do a week’s worth of work and then have the client disagree with the deliverables.

The Given/When/Then strategy for writing user stories and acceptance tests made it significantly easier for me to think through all the features needed and estimate how long they would take to complete. Continue reading
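
To give a flavor of the format (this story is a hypothetical illustration of mine, not one from the post), a Given/When/Then acceptance test might read:

    Given a registered user with an empty cart
    When she adds a $12 book to the cart
    Then the cart total shows $12

Each clause pins down a precondition, an action, and an observable outcome, which is exactly what removes the ambiguity.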

How to Test Ruby IO

Receiving input from a user, doing something with that input, and then displaying some output is core to software development. This input and output is referred to as IO.

In TDD, we should test the behavior of all the code we write, but testing IO can be challenging because we don’t want to be prompted for input or have miscellaneous text printed in the terminal while running our tests. I’m going to walk through a scenario where we have a UI class (i.e., user interface) in Ruby that handles all of the IO and explain how to test it. We’re also going to look at the IO class and its STDIN and STDOUT constants. Continue reading
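
As a minimal sketch of the idea (the class and method names here are my assumptions, not necessarily the post’s), a UI class can accept injectable streams that default to the real terminal, so tests can substitute StringIO objects and never touch STDIN or STDOUT:

    require 'stringio'
    require 'minitest/autorun'

    class UI
      # Accept any IO-like objects; production code uses the real streams.
      def initialize(input = STDIN, output = STDOUT)
        @input = input
        @output = output
      end

      def prompt(message)
        @output.puts message
        @input.gets.chomp
      end
    end

    class UITest < Minitest::Test
      def test_prompt_reads_input_without_touching_the_terminal
        input  = StringIO.new("Laura\n")
        output = StringIO.new

        answer = UI.new(input, output).prompt("What is your name?")

        assert_equal "Laura", answer
        assert_equal "What is your name?\n", output.string
      end
    end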

Self & Metaclasses in Ruby

Self has confused me for a while now, so I decided to get to the bottom of its purpose and meaning.

TLDR:

  • At any point in your program, there is one and only one self
  • Self is the current object accessible to you
  • It can also be said that self is the receiver object of the current method (ex: String is the receiver in String.send(:length))
  • Since classes in Ruby are themselves objects, inside a class definition self is the class you are in at that moment
  • Using self enables you to call a method on a class (like Person), instead of just one instance of a class (like Laura)
  • There are a couple of different ways to define methods on self: ‘self.some_method’ OR ‘class << self’ before a group of methods (see the sketch after this list)
  • You might want to use self if you want to call a method on the class itself rather than on one particular instance (every Person is born on planet Earth, but not every Person is named Laura). This is especially handy if you don’t plan on having more than one instance of a class: maybe you’re creating a game with one Person, one Fish, and one Monkey (I’m not sure why you would do that) and you don’t want to create multiple instances of Person.
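
Here’s a quick sketch of those two styles (the Person class below is my own toy example, not code from the post):

    class Person
      # Defining a method on self makes it a class method,
      # callable as Person.planet without any instance.
      def self.planet
        "Earth"
      end

      # class << self opens the metaclass; methods defined
      # inside it are also class methods.
      class << self
        def species
          "Homo sapiens"
        end
      end

      def initialize(name)
        @name = name
      end

      # Inside an instance method, self is the instance.
      def name
        @name
      end
    end

    puts Person.planet               # => Earth
    puts Person.species              # => Homo sapiens
    puts Person.new("Laura").name    # => Laura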

Okay, now for the longer explanation. Continue reading

Single Responsibility Principle

There’s a group of guidelines and principles to help us write clean code in object-oriented programming, commonly called SOLID, introduced by Robert Martin (aka Uncle Bob).

The five principles of SOLID are:

  1. Single Responsibility Principle
  2. Open-Closed Principle
  3. Liskov Substitution Principle
  4. Interface Segregation Principle
  5. Dependency Inversion Principle

Right now I’m going to explain the Single Responsibility Principle with a real world example. Continue reading
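
The post’s real-world example is behind the link, but as an illustration of my own (not the post’s example), here’s a toy Ruby sketch: a Report class that both computes and prints its result has two reasons to change, so we split the responsibilities.

    # Before: one class with two responsibilities (two reasons to change).
    class Report
      def initialize(entries)
        @entries = entries
      end

      def total
        @entries.sum
      end

      def print
        puts "Total: #{total}"
      end
    end

    # After: calculation and presentation live in separate classes,
    # so each has a single responsibility.
    class ReportCalculator
      def initialize(entries)
        @entries = entries
      end

      def total
        @entries.sum
      end
    end

    class ReportPrinter
      def print(calculator)
        puts "Total: #{calculator.total}"
      end
    end

    ReportPrinter.new.print(ReportCalculator.new([10, 20, 30]))  # Total: 60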

Git & GitHub Explained

At my last job I often heard the word GitHub thrown around by developers. I knew that their code was “on” GitHub, but that was just about all I knew. So when I started learning to code I quickly created a GitHub account (with my favorite username yet: gitlaura) and went through the practice exercises to learn how to use it. I started pushing whatever I was working on to my GitHub account.

Even though I was typing ‘git’ into my command line for commands like ‘git add’, ‘git commit’, and ‘git push’, I had no idea that Git was separate from GitHub until recently. In fact, I assumed that GitHub had somehow created that functionality on my computer.

Well, news flash: Git and GitHub are not the same. I honestly feel a little silly even sharing this with you, but I figure there must be other self-taught programmers out there who don’t understand the difference between Git and GitHub. So let me explain. Continue reading

The Need for Speed: Big-O Notation (Part 1)

Big-O notation sounds like a scary concept. I think it’s the mathy word ‘notation’ that makes it so, but it’s actually not that difficult to wrap your mind around.

Defining Big-O Notation
Big-O notation is used to classify algorithms by how they respond (e.g., in how long they take to run) to changes in input size. Continue reading
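
To make that concrete (this benchmark is my own illustration, not part of the original post), compare a linear scan, which is O(n), with a hash lookup, which is roughly O(1):

    require 'benchmark'

    # Linear search: worst-case time grows in step with input size, O(n).
    def linear_include?(array, target)
      array.each { |item| return true if item == target }
      false
    end

    [100_000, 1_000_000].each do |n|
      array = (1..n).to_a
      hash  = array.to_h { |x| [x, true] }  # hash lookup stays near O(1)

      scan   = Benchmark.realtime { linear_include?(array, n) }
      lookup = Benchmark.realtime { hash.key?(n) }
      puts format("n=%d  scan: %.5fs  hash: %.5fs", n, scan, lookup)
    end

Growing n tenfold makes the scan roughly ten times slower, while the hash lookup barely changes.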

Not sure where to start? Start simple with TDD.

I’m new to Ruby and to Test Driven Development (TDD). Good thing the latter is helping me learn the former and write better code in the process.

I recently went through the Roman Numeral Kata shown by Jim Weirich here to get more acquainted with TDD. This kata showed me that through TDD I could write pretty complex (hey, for me this is complex) code without much difficulty. Continue reading
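
For reference, a compact solution in the spirit of the kata might end up looking like this (my own sketch plus a couple of the driving tests, not Jim Weirich’s exact code):

    require 'minitest/autorun'

    # Greedily subtract the largest value that still fits.
    ROMAN = {
      1000 => "M", 900 => "CM", 500 => "D", 400 => "CD",
      100  => "C", 90  => "XC", 50  => "L", 40  => "XL",
      10   => "X", 9   => "IX", 5   => "V", 4   => "IV", 1 => "I"
    }

    def to_roman(number)
      ROMAN.each_with_object("") do |(value, letters), result|
        while number >= value
          result << letters
          number -= value
        end
      end
    end

    class RomanTest < Minitest::Test
      def test_conversions
        assert_equal "I", to_roman(1)
        assert_equal "IV", to_roman(4)
        assert_equal "MCMXC", to_roman(1990)
      end
    end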