I think people's insistence that Ruby is worthwhile rests on its expressiveness when defining the behavior of an algorithm. Personally, I don't see how the Ruby code in that example is any more expressive than the C# code, but that's not the point. For the sake of argument, I will grant that it somehow might be for some people.
It still doesn't matter. People are already able to write and understand very complex algorithms. Making algorithms easier to implement is not a good thing. For one thing, it optimizes the easiest part of what we do. For another, it enables programmers with outdated skill sets to keep doing what they are comfortable doing: writing "programs" and wrapping them up in objects.
Defining algorithms is not what good programmers do. What good programmers do is encapsulate variation to such an extent that there is no need to define a complex algorithm. This is very difficult, because it requires us to think in terms of entities with responsibilities rather than in terms of sequences of steps, some of which are conditionally executed.
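To make that concrete, here is a rough C# sketch of what I mean by encapsulating variation. (The shipping-policy example is my own invention for illustration, not taken from the example under discussion.) Instead of one method full of conditionally executed steps, each variant is an entity with a single responsibility:

    using System;

    // The alternative to "if international ... else if express ..." branching:
    // each policy object owns its variant of the behavior.
    public interface IShippingPolicy
    {
        decimal CostFor(decimal weightKg);
    }

    public class FlatRatePolicy : IShippingPolicy
    {
        public decimal CostFor(decimal weightKg) => 5.00m;
    }

    public class PerKilogramPolicy : IShippingPolicy
    {
        private readonly decimal _ratePerKg;
        public PerKilogramPolicy(decimal ratePerKg) => _ratePerKg = ratePerKg;
        public decimal CostFor(decimal weightKg) => _ratePerKg * weightKg;
    }

    public class Order
    {
        private readonly IShippingPolicy _shipping;
        public Order(IShippingPolicy shipping) => _shipping = shipping;

        // No conditional steps here: the variation lives in the policy object.
        public decimal ShippingCost(decimal weightKg) => _shipping.CostFor(weightKg);
    }

    public static class Demo
    {
        public static void Main()
        {
            var domestic = new Order(new FlatRatePolicy());
            var freight  = new Order(new PerKilogramPolicy(1.25m));
            Console.WriteLine(domestic.ShippingCost(3m)); // 5.00
            Console.WriteLine(freight.ShippingCost(3m));  // 3.75
        }
    }

The point is that supporting a new kind of shipping means adding a new class with a responsibility, not threading another branch through an existing algorithm.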
So I think that Ruby is a bit of a golden hammer: it's optimized for how it looks rather than how it functions - it focuses on letting people indulge their bad habits rather than giving them an incentive to correct them.
That is... unless there is some way it enables people to define abstractions and encapsulate variation better than other languages do. If there is, please send me a link to an example that demonstrates it.