"Programming Scala" Is Now Available 154

Posted by Dean Wampler Mon, 21 Sep 2009 22:48:00 GMT

Programming Scala is finally available online at Amazon, O’Reilly, Safari, and hopefully at other online and physical bookstores, too. If you have trouble finding it, just memorize the ISBN, 9780596155957, to aid your search. ;)


You can download the complete code examples here. If you want to “try before you buy”, see our labs site.

I pitched the book idea just over one year ago. Alex Payne, my co-author, was also talking to O’Reilly about a book, so we joined forces. It’s been a fast ride.

This book is for you if you are an experienced developer who wants a comprehensive but fast-paced introduction to Scala. We try to show you why we like Scala so much, but we also tell you about the dark corners and gotchas that you'll find in any language. We even preview the forthcoming 2.8 version of Scala. I hope you'll give Programming Scala a look.

QCon SF 2008: Radical Simplification through Polyglot and Poly-paradigm Programming

Posted by Dean Wampler Tue, 15 Sep 2009 15:35:00 GMT

InfoQ has posted the video of my talk at last year's QCon San Francisco on Radical Simplification through Polyglot and Poly-paradigm Programming. I make the case that relying on just one programming language or one modularity paradigm (e.g., object-oriented programming or functional programming) is insufficient for most applications we're building today, from embedded systems and games up to complex Internet and enterprise applications.

I’m giving an updated version of this talk at the Strange Loop Conference, October 22-23, in St. Louis. I hope to see you there.

A Milestone for "Programming Scala" 78

Posted by Dean Wampler Wed, 08 Jul 2009 02:02:00 GMT

My co-author, Alex Payne, and I hit a major milestone today for Programming Scala. After a feverish holiday weekend of writing, we finished all the remaining sections of the book. This morning, we released it to O’Reilly’s crack production team. “If you love something, set it free” or something sappy like that. Of course, the corollary is, “if it doesn’t come back, hunt it down and kill it…”

But I digress. There will be more reviews and final edits. You can still add your comments online using the link above. However, the text is essentially done. It has started the final journey that will turn our words into a book, something of a life-long dream of mine that I waited too long to achieve. Kudos to Alex for pursuing this dream early in his career. If it's a dream you have entertained, know that it's never too late.

On the other hand, whatever modest qualities the book possesses reflect the combined years of experience that Alex and I have acquired, sometimes painfully. In a way, as I reflect on what I wrote, Programming Scala is a software design book masquerading as a language book. The truly seductive quality of Scala is that it makes elegant design concepts possible, which is why I’ve placed so much faith in the language.

Alex posted his own thoughts on the project, which I hope you’ll take the time to read. I’m grateful for his insights and experience with Scala, the elegance of his prose in the book, and the great work he’s done at Twitter, which caused me to get so addicted to Twitter that I spent too much time tweeting over the last year, which meant I had to work my proverbial butt off this past weekend to get the book done. Thanks a lot!!

Rich Hickey on Testing

Posted by Dean Wampler Fri, 05 Jun 2009 17:40:00 GMT

It was an interesting week at JavaOne, with lots of talks and hallway discussions about new languages on the JVM. One of those languages is Clojure.

Rich Hickey, the creator of Clojure, gave a talk at the Bay Area Clojure User Group Wednesday evening. During the Q&A part, he said that he’s not big on writing tests, although he always runs the tests that other people have written before he commits changes.

Of course, there are many people, including us Object Mentors, who consider TDD to be an essential part of professional software development. Obviously not everyone agrees. James Coplien has been another critic of this view.

We should never accept any dogma. Why is TDD considered important? What does it purport to do? TDD provides two important benefits.

  • Driving the design.
  • Building a suite of automated regression tests.

So, if you can satisfy both requirements without TDD, then technically you don't need it. In Rich's case, he said he spends a lot of time thinking about what he's going to do before he does it. In this way, he satisfies the first requirement, driving the design. I had a spirited discussion with some ThoughtWorkers afterwards, and Ola Bini said what a lot of us think: "I do that thinking by writing tests." I'll freely admit that when I am really experimenting with ideas, I might just write code, but once I know how to proceed, I return to the beginning and test-drive the "production" code.

Rich also made an off-hand comment that if he screws something up, he's got thousands of users who will let him know! That ex post facto testing, along with Rich's own devotion to doing high-quality work, does a good job of handling regressions.

But Rich mentioned something else that is also very important. In a functional language, where values are immutable and mutable state is handled in specific, principled ways, regressions don't happen nearly as often. Clojure has one of the most deeply thought-out approaches to handling state, and that is the genius of Clojure.

I asked Rich how long he worked on Clojure before releasing it to the world. He spent about 2 1/2 years, much of that time working exclusively on Clojure (and eating through his savings). When he finally “announced” it, his “marketing” consisted of one email to some friends in the Common Lisp community. The rest was viral, a testament to the justified excitement Clojure has generated.

For me, I’ll probably always do my design thinking through tests, especially when I’m writing code in imperative languages, like Java and Ruby. I’ll continue to encourage my clients to use TDD, because I find that TDD is the most productive way to achieve high quality. I want the safety net of a good test suite. I’m also writing more and more of my code in a functional style, with minimal side effects and mutable data. You should, too.

Bay-Area Scala Enthusiasts (BASE) Meeting: What's New In Scala 2.8

Posted by Dean Wampler Fri, 05 Jun 2009 07:13:00 GMT

This week is JavaOne in San Francisco, and the Bay-Area Scala Enthusiasts (BASE) held their monthly meeting. Martin Odersky, the creator of Scala, was the special guest. He discussed what's new in Scala 2.8, followed by Q&A. We met at Twitter HQ.

These are my notes, focusing primarily on Martin’s presentation, and filled in afterwards with additional details. Any transcription errors or erroneous extrapolations are my own fault. It’s also late in the day…

Some of the features are not yet in the SVN trunk, so don't assume my examples actually work! See scala-lang.org for more details on Scala 2.8 features.

There are a few more months before it is released. A preview is planned for July, followed by the final release in September or October.

New Features

Here are the new features for this release.

Named and Default Arguments

Scala method parameters can now be declared with default values, so callers don't have to specify a value and you don't have to fall back on the implicit-argument convention. The default "values" aren't limited to constants; any valid expression can be used. Here is an example that I made up (not in Martin's slides) that illustrates both specifying and using one default argument and using named arguments.

    
def joiner(strings: List[String], separator: String = " ") = strings.mkString(separator)

val strs = List("Now", "is", "the", "time", "for", "all", "good", "men", "...")
println(joiner(strs))
println(joiner(strs, "|"))
println(joiner(strings = strs, separator = "-"))
    

Named and default arguments enable an elegant enhancement to case classes. It’s great that I can declare a succinct value class like this.

    
case class Person(firstName: String, lastName: String, age: Int)
    

What if I want to make a copy that modifies one or more fields? There's no elegant way to add such a method in 2.7 without implementing every permutation, that is, every possible combination of fields I might want to change. The new copy method will make this easy.

    
case class Person(firstName: String, lastName: String, age: Int)

val youngPerson = Person("Dean", "Wampler", 29)
val oldPerson = youngPerson copy (age = 30)
    

I’m using the infix notation for method invocation on the last line (i.e., it’s equivalent to ... youngPerson.copy(...)). I can specify any combination of the fields I want to change in the list passed to copy. The generated implementation of copy will use the current values of any other fields as the default values.

The implementation looks something like this.

    
case class Person(firstName: String, lastName: String, age: Int) {
  def copy (firstName: String = this.firstName, 
            lastName: String = this.lastName, 
            age: Int = this.age) = new Person(firstName, lastName, age)
}
    

Quite elegant, once you have default and named arguments!!

Defaults for parameters can’t refer to previous parameters in the list, unless the function is curried. (I’m not sure I got this right, nor do I understand the reasons why this is true – if it’s true!)
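
If I followed the curried case correctly, the distinction looks something like this (my own untested sketch, not from the slides):

// Within a single parameter list, a default can't refer to an earlier parameter:
//   def greet(name: String, greeting: String = "Hello, " + name) = ...   // rejected
// In a curried definition, a default in a later list can refer to an earlier list:
def greet(name: String)(greeting: String = "Hello, " + name) = greeting

greet("Martin")()                   // "Hello, Martin"
greet("Martin")("Guten Tag, Martin")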

By the way, Martin reminded us that method parameters are always evaluated left to right at the call site. Do you remember the rules for Java, C++, C#,...?

Nested Annotations

Annotations can now be nested, which is important for using some of the standard annotation definitions in the JDK and JEE. This feature also exploits named and default arguments.

    
@Annotation1(foo = @Annotation2)
    

Package Objects

People have complained that they want top-level definitions for a package, but they have to put those definitions, like types and methods, inside an object or class, which doesn't quite fit and makes references awkward, since they must be qualified by both the package and the enclosing type. The problem was especially obvious when the team started working on the major reorganization of the collections (discussed below). So, Scala 2.8 will support "package objects".

    
package object scala {
  type List[+T] = scala.collection.immutable.List[T]
  val List = scala.collection.immutable.List
}
    

Our friend List has moved to scala.collection.immutable.List, but we would still like to reference it as if it were in the scala package. The definition above declares a package-level type and val that effectively make List accessible in the scala scope. In Scala 2.7 you would have to do something like the following (ignoring Predef for a moment).

    
package scala {
  object toplevel {
    type List[+T] = scala.collection.immutable.List[T]
    val List = scala.collection.immutable.List
  }
}
    

But then you would have to reference List using scala.toplevel.List.

Previously, they got around this problem by putting a lot of definitions like these in Predef, which is imported automatically, but that approach has several disadvantages.

  • Predef is a big, amorphous collection of stuff.
  • You can’t define your own Predef with the same convenient usage semantics, i.e., no special import required and no way to reference definitions like package.type. You would have to use the alternative I just showed with toplevel in the middle.

Package objects give you a place for definitions that you want to appear at the package scope without having to define them in a singleton object or class.

Finally, besides types and fields as shown, package objects can also define methods. They can also inherit from traits and classes.
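
Here is a trivial, made-up example (mine, not from the talk) of a package object that defines both a val and a method:

package object mylib {
  val defaultEncoding = "UTF-8"
  def now() = new java.util.Date
}

// Clients can then write mylib.now() or mylib.defaultEncoding,
// or bring everything in with import mylib._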

@specialized

Scala generics are fully compiled at declaration time using a uniform representation, rather than being expanded at each use, like C++ templates. This fits the way Java works, where there isn't a giant link step to resolve all references, etc. However, it has a major performance disadvantage when generic types are actually used with AnyVal types that Scala normally optimizes to primitives.

For example, any closures require the use of FunctionN[T1, T2, ...], e.g.,

    
def m[T](x: T, f: T => T) = f(x)

m(2, (x:Int) => x * 2)
    

The f closure in the definition of m will be an instance of Function1[T,T]. However, when we use AnyVal types, as in the last line, this causes primitive boxing and unboxing several times, hurting performance even though it is completely unnecessary in the special case where primitives are being used. This is also bad for arrays and some other data structures.

The new @specialized annotation fixes this problem by causing the compiler to generate different versions of the user-specified generic type or method for each of the primitive types.

    
def m[@specialized T](x: T, f: T => T) = f(x)

m(2, (x:Int) => x * 2)
    

There is a real risk of a code explosion. Consider what would have to be generated to support every type permutation for Function22! For this reason, the library only specializes cases with up to two type parameters. You can also choose to annotate only some of the type parameters, as appropriate, and the annotation accepts arguments that limit which primitive types will be specialized, e.g., only Ints and Longs.
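
Based on Martin's description, limiting the specialization to particular primitives should look something like this (my sketch; the syntax may change before the release):

def m[@specialized(Int, Long) T](x: T, f: T => T) = f(x)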

This feature is not yet in the 2.8 trunk, but it will be soon.

Improved Collections

Collections are getting a major revamp. First, they want to eliminate gratuitous differences in package structure and implementations. Today, in many cases, the map method and others have to be redefined for each basic collection type, rather than being shared between them.

New Collections Design

The new version of the library will support the following.

  • Uniform structure.
  • Every operation is implemented only once.
  • Selection of building blocks in a separate package called scala.collection.generic. These are normally only used by implementers of immutable and mutable collections.
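
As a small illustration of what "implemented only once" should mean at the use site (my example, assuming the redesign lands as described), the same map call works uniformly across collection kinds and returns the corresponding type each time:

List(1, 2, 3) map (_ * 2)          // List(2, 4, 6)
Set("a", "b") map (_.toUpperCase)  // Set("A", "B")
"abc" map (_.toUpper)              // "ABC": still a String, not a generic sequence of Chars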

Because of the reorganization, some Scala 2.7 source code won’t be compatible with 2.8 without modifications.

Better Tools

  • The REPL will have command completion, in addition to other enhancements.
  • They have greatly improved the IDE and compiler interface. Miles Sabin and Iulian Dragos worked on this with Martin. There is limited and somewhat unstable support in Eclipse now.

New Control Abstractions

Several new control abstractions are being introduced.

  • Continuations will be supported with a compiler plugin.
  • Scala has not had a break keyword. It will now exist, but as a library method (see the sketch after this list).
  • Scala will optimize trampolining tail calls (e.g., foo1 tail calls foo2, which tail calls foo1, and back and forth).
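
For break, the library-based version should look roughly like this (my sketch, based on the scala.util.control.Breaks utility; details may differ in the final release):

import scala.util.control.Breaks._

breakable {
  for (i <- 1 to 10) {
    if (i > 3) break   // throws a control exception that breakable catches
    println(i)
  }
}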

More features

  • The Swing wrapper library has been enhanced.
  • The performance has been improved in several ways.
    • Structural type dispatch
    • Actors
    • Vectors, sets, and maps. Their long-term goal is to implement the fastest ones available for the JVM.

These changes are not yet in the trunk.

Beyond 2.8

Longer term, they plan significant improvements in support for parallelism and concurrency, including new concurrency models besides actors, such as:
  • Transactions (STM)
  • Data parallelism
  • Stream processing

Clojure is influencing this. Martin praised the competition ;) Fortunately, the original designer of the data structures and algorithms used heavily by Clojure is working on Scala versions. (Name?)

Doug Lea wants to work with the team on concurrent data structures. The lack of closures in Java makes this effort more difficult on the Java side.

There is some exciting work in advanced type system support for guaranteeing actor isolation and effect tracking. For example, this technology would allow actors to exchange references to big objects without copying them, while ensuring that they aren't modified concurrently.

On a final note, Bill Wake described a conversation he had with Joshua Bloch today who admitted that the time has arrived for him to look seriously at Scala. A possible endorsement from Joshua Bloch would be a major step for Scala.

Is the Supremacy of Object-Oriented Programming Over?

Posted by Dean Wampler Tue, 21 Apr 2009 02:45:00 GMT

I never expected to see this. When I started my career, Object-Oriented Programming (OOP) was going mainstream. For many problems, it was and still is a natural way to modularize an application. It grew to (mostly) rule the world. Now it seems that the supremacy of objects may be coming to an end, of sorts.

I say this because of recent trends in our industry and my hands-on experience with many enterprise and Internet applications, mostly at client sites. You might be thinking that I'm referring to the mainstream breakout of Functional Programming (FP), which is happening right now. The killer app for FP is concurrency. We've all heard that more and more applications must be concurrent these days (which doesn't necessarily mean multithreaded). When we remove side effects from functions and disallow mutable variables, our concurrency issues largely go away. The success of the Actor model of concurrency, as used to great effect in Erlang, is one example of a functional-style approach. The rise of map-reduce computations is another example of a functional technique going mainstream. A related phenomenon, the emergence of key-value store databases like BigTable and CouchDB, is a reaction to the overhead of SQL databases when the performance cost of the Relational Model isn't justified. These databases are typically managed with functional techniques, like map-reduce.

But actually, I’m thinking of something else. Hybrid languages like Scala, F#, and OCaml have demonstrated that OOP and FP can complement each other. In a given context, they let you use the idioms that make the most sense for your particular needs. For example, immutable “objects” and functional-style pattern matching is a killer combination.

What’s really got me thinking that objects are losing their supremacy is a very mundane problem. It’s a problem that isn’t new, but like concurrency, it just seems to grow worse and worse.

The problem is that there is never a stable, clear object model in applications any more. What constitutes a BankAccount or Customer or whatever is fluid. It changes with each iteration. It's different from one subsystem to another, even within the same iteration! I see a lot of misfit object models that try to be all things to all people, so they are bloated, and the teams that own them can't be agile. The other extreme is "balkanization", where each subsystem has its own model. We tend to think the latter case is bad. However, is lean-and-mean but non-standard really worse than bloated but standardized?

The fact is, for a lot of these applications, it’s just data. The ceremony of object wrappers doesn’t carry its weight. Just put the data in a hash map (or a list if you don’t need the bits “labeled”) and then process the collection with your iterate, map, and reduce functions. This may sound heretical, but how much Java code could you delete today if you replaced it with a stored procedure?
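
To make that concrete, here is a small made-up sketch in Scala (since that's what I've been writing lately) of the "it's just data" style, with no Account class in sight:

// Each "record" is just a map of field names to values.
val accounts = List(
  Map("id" -> 1, "balance" -> 150.0),
  Map("id" -> 2, "balance" -> -25.0),
  Map("id" -> 3, "balance" -> 75.0))

// Total of the positive balances, using map, filter, and reduce.
val total = accounts.map(_("balance").asInstanceOf[Double])
                    .filter(_ > 0.0)
                    .reduceLeft(_ + _)   // 225.0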

These alternatives won’t work for all situations, of course. Sometimes polymorphism carries its weight. Unfortunately, it’s too tempting to use objects as if more is always better, like cow bell.

So what would replace objects for supremacy? Well, my point is really that there is no one true way. We’ve led ourselves down the wrong path. Or, to be more precise, we followed a single, very good path, but we didn’t know when to take a different path.

Increasingly, the best, most nimble designs I see use objects with a light touch; shallow hierarchies, small objects that try to obey the Single Responsibility Principle, composition rather than inheritance, etc. Coupled with a liberal use of functional idioms (like iterate, map, and reduce), these designs strike the right balance between the protection of data hiding vs. openness for easy processing. By the way, you can build these designs in almost any of our popular languages. Some languages make this easier than others, of course.

Despite the hype, I think Domain-Specific Languages (DSLs) are also very important and worth mentioning in this context. (Language-Oriented Programming – LOP – generalizes these ideas). It’s true that people drink the DSL Kool-Aid and create a mess. However, when used appropriately, DSLs reduce a program to its essential complexity, while hiding and modularizing the accidental complexity of the implementation. When it becomes easy to write a user story in code, we won’t obsess as much over the details of a BankAccount as they change from one story to another. We will embrace more flexible data persistence models, too.

Back to OOP and FP, I see the potential for their combination to lead to a rebirth of the old vision of software components, but that’s a topic for another blog post.

Pat Eyler Interviews Dean Wampler and Alex Payne on "Programming Scala"

Posted by Dean Wampler Tue, 17 Mar 2009 17:48:00 GMT

Pat Eyler posted an interview with Alex Payne and me (Dean Wampler), which we conducted over email. We dish on Scala, Functional Programming, and our forthcoming book Programming Scala.

Tighter Ruby Methods with Functional-style Pattern Matching, Using the Case Gem

Posted by Dean Wampler Tue, 17 Mar 2009 00:59:00 GMT

Ruby doesn’t have overloaded methods, which are methods with the same name, but different signatures when you consider the argument lists and return values. This would be somewhat challenging to support in a dynamic language with very flexible options for method argument handling.

You can “simulate” overloading by parsing the argument list and taking different paths of execution based on the structure you find. This post discusses how pattern matching, a hallmark of functional programming, gives you powerful options.

First, let’s look at a typical example that handles the arguments in an ad hoc fashion. Consider the following Person class. You can pass three arguments to the initializer, the first_name, the last_name, and the age. Or, you can pass a hash using the keys :first_name, :last_name, and :age.


require "rubygems" 
require "spec" 

class Person
  attr_reader :first_name, :last_name, :age
  def initialize *args
    arg = args[0]
    if arg.kind_of? Hash       # 1
      @first_name = arg[:first_name]
      @last_name  = arg[:last_name]
      @age        = arg[:age]
    else
      @first_name = args[0]
      @last_name  = args[1]
      @age        = args[2]
    end
  end
end

describe "Person#initialize" do 
  it "should accept a hash with key-value pairs for the attributes" do
    person = Person.new :first_name => "Dean", :last_name => "Wampler", :age => 39
    person.first_name.should == "Dean" 
    person.last_name.should  == "Wampler" 
    person.age.should        == 39
  end
  it "should accept a first name, last name, and age arguments" do
    person = Person.new "Dean", "Wampler", 39
    person.first_name.should == "Dean" 
    person.last_name.should  == "Wampler" 
    person.age.should        == 39
  end
end

The condition on the # 1 comment line checks to see if the first argument is a Hash. If so, the attribute’s values are extracted from it. Otherwise, it is assumed that three arguments were specified in a particular order. They are passed to #initialize in a three-element array. The two rspec examples exercise these behaviors. For simplicity, we ignore some more general cases, as well as error handling.

Another approach that is more flexible is to use duck typing, instead. For example, we could replace the line with the # 1 comment with this line:


if arg.respond_to? :has_key?

There aren’t many objects that respond to #has_key?, so we’re highly confident that we can use [symbol] to extract the values from the hash.

This implementation is fairly straightforward. You’ve probably written code like this yourself. However, it could get complicated for more involved cases.

Pattern Matching, a Functional Programming Approach

Most programming languages today have switch or case statements of some sort, and most support regular expression matching. In functional programming languages, however, pattern matching is so important and pervasive that these languages offer far more powerful and convenient support for it.
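
As a point of comparison (my aside, in Scala rather than Ruby), a built-in pattern match can test structure, types, and extra conditions in one step:

// A hypothetical Scala analog of the Person example we've been using.
case class Person(firstName: String, lastName: String, age: Int)

def makePerson(args: Any): Person = args match {
  case (first: String, last: String, age: Int) if age > 0 =>
    Person(first, last, age)
  case _ =>
    throw new IllegalArgumentException("Invalid arguments: " + args)
}

makePerson(("Dean", "Wampler", 39))   // Person(Dean,Wampler,39)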

Fortunately, we can get powerful pattern matching, typical of functional languages, in Ruby using the Case gem that is part of MenTaLguY's Omnibus Concurrency library. Omnibus provides support for the hot Actor model of concurrency, which Erlang has made famous. However, it would be a shame to restrict the use of the Case gem to parsing Actor messages. It's much more general purpose than that.

Let’s rework our example using the Case gem.


require "rubygems" 
require "spec" 
require "case" 

class Person
  attr_reader :first_name, :last_name, :age
  def initialize *args
    case args
    when Case[Hash]       # 1
      arg = args[0]
      @first_name = arg[:first_name]
      @last_name  = arg[:last_name]
      @age        = arg[:age]
    else
      @first_name = args[0]
      @last_name  = args[1]
      @age        = args[2]
    end
  end
end

describe "Person#initialize" do 
  it "should accept a first name, last name, and age arguments" do
    person = Person.new "Dean", "Wampler", 39
    person.first_name.should == "Dean" 
    person.last_name.should  == "Wampler" 
    person.age.should        == 39
  end
  it "should accept a has with :first_name => fn, :last_name => ln, and :age => age" do
    person = Person.new :first_name => "Dean", :last_name => "Wampler", :age => 39
    person.first_name.should == "Dean" 
    person.last_name.should  == "Wampler" 
    person.age.should        == 39
  end
end

We require the case gem, which puts the #=== method on steroids. In the when statement in #initialize, the expression when Case[Hash] matches on a one-element array where the element is a Hash. We extract the key-value pairs as before. The else clause assumes we have an array for the arguments.

So far, this is isn’t very impressive, but all we did was to reproduce the original behavior. Let’s extend the example to really exploit some of the neat features of the Case gem’s pattern matching. First, let’s narrow the allowed array values.


require "rubygems" 
require "spec" 
require "case" 

class Person
  attr_reader :first_name, :last_name, :age
  def initialize *args
    case args
    when Case[Hash]       # 1
      arg = args[0]
      @first_name = arg[:first_name]
      @last_name  = arg[:last_name]
      @age        = arg[:age]
    when Case[String, String, Integer]
      @first_name = args[0]
      @last_name  = args[1]
      @age        = args[2]
    else
      raise "Invalid arguments: #{args}" 
    end
  end
end

describe "Person#initialize" do 
  it "should accept a first name, last name, and age arguments" do
    person = Person.new "Dean", "Wampler", 39
    person.first_name.should == "Dean" 
    person.last_name.should  == "Wampler" 
    person.age.should        == 39
  end
  it "should accept a has with :first_name => fn, :last_name => ln, and :age => age" do
    person = Person.new :first_name => "Dean", :last_name => "Wampler", :age => 39
    person.first_name.should == "Dean" 
    person.last_name.should  == "Wampler" 
    person.age.should        == 39
  end
  it "should not accept an array unless it is a [String, String, Integer]" do
    lambda { person = Person.new "Dean", "Wampler", "39" }.should raise_error(Exception)
  end
end

The new expression when Case[String, String, Integer] only matches a three-element array where the first two arguments are strings and the third argument is an integer, which are the types we want. If you use an array with a different number of arguments or the arguments have different types, this when clause won’t match. Instead, you’ll get the default else clause, which raises an exception. We added another rspec example to test this condition, where the user’s age was specified as a string instead of as an integer. Of course, you could decide to attempt a conversion of this argument, to make your code more “forgiving” of user mistakes.

Similarly, what happens if the method supports default values for some of the parameters? As written, we can't support that option, but let's look at a slight variation of Person#initialize, one that doesn't accept a hash of values, to see what would happen.


require "rubygems" 
require "spec" 
require "case" 

class Person
  attr_reader :first_name, :last_name, :age
  def initialize first_name = "Bob", last_name = "Martin", age = 29
    case [first_name, last_name, age]
    when Case[String, String, Integer]
      @first_name = first_name
      @last_name  = last_name
      @age        = age
    else
      raise "Invalid arguments: #{first_name}, #{last_name}, #{age}" 
    end
  end
end

def check person, expected_fn, expected_ln, expected_age
  person.first_name.should == expected_fn
  person.last_name.should  == expected_ln
  person.age.should        == expected_age
end

describe "Person#initialize" do 
  it "should require a first name (string), last name (string), and age (integer) arguments" do
    person = Person.new "Dean", "Wampler", 39
    check person, "Dean", "Wampler", 39
  end
  it "should accept the defaults for all parameters" do
    person = Person.new
    check person, "Bob", "Martin", 29
  end
  it "should accept the defaults for the last name and age parameters" do
    person = Person.new "Dean" 
    check person, "Dean", "Martin", 29
  end
  it "should accept the defaults for the age parameter" do
    person = Person.new "Dean", "Wampler" 
    check person, "Dean", "Wampler", 29
  end
  it "should not accept the first name as a symbol" do
    lambda { person = Person.new :Dean, "Wampler", "39" }.should raise_error(Exception)
  end
  it "should not accept the last name as a symbol" do
  end
  it "should not accept the age as a string" do
    lambda { person = Person.new "Dean", "Wampler", "39" }.should raise_error(Exception)
  end
end

We match on all three arguments as an array, asserting they are of the correct type. As you might expect, #initialize always gets three parameters passed to it, including when default values are used.

Let’s return to our original example, where the object can be constructed with a hash or a list of arguments. There are two more things (at least …) that we can do. First, we’re not yet validating the types of the values in the hash. Second, we can use the Case gem to impose constraints on the values, such as requiring non-empty name strings and a positive age.


require "rubygems" 
require "spec" 
require "case" 

class Person
  attr_reader :first_name, :last_name, :age
  def initialize *args
    case args
    when Case[Hash]
      arg = args[0]
      @first_name = arg[:first_name]
      @last_name  = arg[:last_name]
      @age        = arg[:age]
    when Case[String, String, Integer]
      @first_name = args[0]
      @last_name  = args[1]
      @age        = args[2]
    else
      raise "Invalid arguments: #{args}" 
    end
    validate_name @first_name, "first_name" 
    validate_name @last_name, "last_name" 
    validate_age
  end

  protected

  def validate_name name, field_name
    case name
    when Case::All[String, Case.guard {|s| s.length > 0 }]
    else
      raise "Invalid #{field_name}: #{first_name}" 
    end
  end

  def validate_age
    case @age
    when Case::All[Integer, Case.guard {|n| n > 0 }]
    else
      raise "Invalid age: #{@age}" 
    end
  end
end

describe "Person#initialize" do 
  it "should accept a first name, last name, and age arguments" do
    person = Person.new "Dean", "Wampler", 39
    person.first_name.should == "Dean" 
    person.last_name.should  == "Wampler" 
    person.age.should        == 39
  end
  it "should accept a has with :first_name => fn, :last_name => ln, and :age => age" do
    person = Person.new :first_name => "Dean", :last_name => "Wampler", :age => 39
    person.first_name.should == "Dean" 
    person.last_name.should  == "Wampler" 
    person.age.should        == 39
  end
  it "should not accept an array unless it is a [String, String, Integer]" do
    lambda { person = Person.new "Dean", "Wampler", "39" }.should raise_error(Exception)
  end
  it "should not accept a first name that is a zero-length string" do
    lambda { person = Person.new "", "Wampler", 39 }.should raise_error(Exception)
  end    
  it "should not accept a first name that is not a string" do
    lambda { person = Person.new :Dean, "Wampler", 39 }.should raise_error(Exception)
  end    
  it "should not accept a last name that is a zero-length string" do
    lambda { person = Person.new "Dean", "", 39 }.should raise_error(Exception)
  end    
  it "should not accept a last name that is not a string" do
    lambda { person = Person.new :Dean, :Wampler, 39 }.should raise_error(Exception)
  end    
  it "should not accept an age that is less than or equal to zero" do
    lambda { person = Person.new "Dean", "Wampler", -1 }.should raise_error(Exception)
    lambda { person = Person.new "Dean", "Wampler", 0 }.should raise_error(Exception)
  end    
  it "should not accept an age that is not an integer" do
    lambda { person = Person.new :Dean, :Wampler, "39" }.should raise_error(Exception)
  end    
end

We have added validate_name and validate_age methods that are invoked at the end of #initialize. In validate_name, the one when clause requires “all” the conditions to be true, that the name is a string and that it has a non-zero length. Similarly, validate_age has a when clause that requires age to be a positive integer.

Final Thoughts

So, how valuable is this? The code is certainly longer, but it specifies and enforces expected behavior more precisely. The rspec examples verify the enforcement. It smells a little of static typing, which is good or bad, depending on your point of view. ;)

Personally, I think the conditional checks are a good way to add robustness in small ways to libraries that will grow and evolve for a long time. The checks document the required behavior for code readers, like new team members, but of course, they should really get that information from the tests. ;) (However, it would be nice to extract the information into the rdocs.)

For small, short-lived projects, I might not worry about the conditional checks as much (but how many times have those “short-lived projects” refused to die?).

You can read more about Omnibus and Case in this InfoQ interview with MenTaLguY. I didn't discuss using the Actor model of concurrency, for which these gems were designed. For an example of Actors using Omnibus, see my Better Ruby through Functional Programming presentation or the Confreaks video of an earlier version of the presentation I gave at last year's RubyConf.

1st Ever Chicago Area Scala Enthusiasts (CASE) Meeting Tonight

Posted by Dean Wampler Fri, 20 Feb 2009 00:17:00 GMT

Tonight is our first meeting, at the ThoughtWorks offices in the Aon building downtown. If you’re going and you haven’t RSVP’ed, either send a tweet to @chicagoscala or reply here ASAP!

Hope to see you there. Our meetings will be the 3rd Thursday of each month.

Organizing a Chicago Area Scala Enthusiasts (CASE) Group

Posted by Dean Wampler Sat, 17 Jan 2009 23:02:00 GMT

I’m organizing a group in Chicago for people interested in Scala, called the Chicago Area Scala Enthusiasts (CASE). If you’re interested, join the google group for more information.
