I love the 90's: The Fusion Episode 15

Posted by Brett Schuchert Wed, 02 Jul 2008 20:55:00 GMT

A few weeks back I was working with a team on the East Coast. They wanted to develop a simulator to assist in testing other software components. The system they wanted to simulate is well described in a specification using diagrams close to the sequence diagrams described in the UML.

In fact, these diagrams were of a variety I’d call “system” sequence diagrams. They described the interaction between outside entities (actors – in this case another system) and the system to be simulated.

This brought me back to 1993, when I was introduced to The Fusion Method by Coleman et al. Before that I had read Booch (1 and 2) and Rumbaugh (OMT) and I honestly didn’t follow much of their material – I had book knowledge but I really didn’t practice it. I always thought that Booch was especially strong in Design ideas and notation but weak in Analysis. I thought the opposite of Rumbaugh, so the two together + Jacobson with Use Cases and Business Modeling really formed a great team in terms of covering the kinds of things you need to cover in a thorough software process (UP + UML).

But before all that was Fusion.

Several colleagues and I really grokked Fusion. It started with system sequence diagrams showing interactions much like the specification I mentioned above. It also described a difference between analysis and design (and if Uncle Bob reads this, he’ll probably have some strong words about so-called object-oriented analysis; well, this was 15 years ago… though I still think there is some value to be found there). Anyway, this is mostly about system sequence diagrams so I won’t say much more about that excellent process.

Each system sequence diagram represented a scenario. To represent many scenarios, Fusion offered a BNF-based syntax for expressing them. (I understand this was partly because, for political reasons within HP, they were not allowed to include a state model, but I don’t know whether that is true.) For several years I practiced Fusion, and I still often revert to it if I’m not trying to do anything in particular.
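
To give a feel for what that looks like (a rough illustration only, not Fusion’s exact notation), such a grammar lets you compose whole families of scenarios out of a handful of named interactions:

    SystemLifecycle = Initialize (SimpleJob | ComplexJob)* Shutdown
    SimpleJob       = ReceiveRequest ProcessJob [SendResponse]

Each name on the right-hand side corresponds to a system sequence diagram (or a step within one), and the grammar says which orderings of those scenarios are legal.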

Spending a little time up front thinking about the logical interaction between the system and its various actors helps me get a big-picture view of the system boundary and its general flow. I have found it helps others as well, but your mileage may vary.

So when I viewed their protocol specification, it really brought back some good memories. And in fact, that’s how we decided to look at the problem.

(What follows is highly idealized.)

We reviewed their specification and decided we’d try to work through the initialization sequence and then work through one sequence that involved “completing a simple job.” I need to keep this high level to keep the identity of the company a secret.

There was prior work and we kept that in mind, but we really started from scratch. Some of the earlier work had been done along the lines of using the Command pattern, so we started there. Of course, once we did our first command, we backed off and went with a more basic design that seemed to fit the complexity a bit better (starting with the Command pattern at the beginning is an example of solution-problemming, to use a Weinberg term – and one of the reasons I’m sometimes skeptical when people start talking in patterns).

We continued working from the request coming into the system, following it as it worked its way through. Along the way, we wrote unit tests, driven by our end goal of trying to complete a simple job and guided by the single responsibility principle. As we thought about the system, there were several logical steps (a rough sketch follows the list):
  • Receive a message from the outside as some array of bytes
  • Determine the “command” represented by the bytes
  • Process the parameters within the command
  • Issue a message to the simulator
  • Create a logical response
  • Format the logical response into the underlying protocol
  • Send the response back
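
Here is a minimal sketch of how those steps might hang together. All of the names are hypothetical stand-ins; the real commands, protocol, and simulator interface stay out of this post:

    // Hypothetical types; a sketch of the request-handling pipeline, not the team's real code.
    interface Command {}
    interface LogicalResponse {}
    interface CommandParser     { Command parse(byte[] rawMessage); }
    interface SimulatorGateway  { LogicalResponse issue(Command command); }
    interface ResponseFormatter { byte[] format(LogicalResponse response); }
    interface ResponseSender    { void send(byte[] protocolBytes); }

    public class RequestHandler {
        private final CommandParser parser;        // determines the "command" in the raw bytes
        private final SimulatorGateway simulator;  // issues messages to the simulator
        private final ResponseFormatter formatter; // turns a logical response into protocol bytes
        private final ResponseSender sender;       // sends the bytes back to the caller

        public RequestHandler(CommandParser parser, SimulatorGateway simulator,
                              ResponseFormatter formatter, ResponseSender sender) {
            this.parser = parser;
            this.simulator = simulator;
            this.formatter = formatter;
            this.sender = sender;
        }

        // One pass through the logical steps listed above.
        public void handle(byte[] rawMessage) {
            Command command = parser.parse(rawMessage);           // determine the command and its parameters
            LogicalResponse response = simulator.issue(command);  // issue a message to the simulator, get a logical response
            byte[] protocolBytes = formatter.format(response);    // format the logical response into the underlying protocol
            sender.send(protocolBytes);                           // send the response back
        }
    }

Splitting the steps across small collaborators like this is also what made it natural to unit test each step on its own.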

At the time, they were considering using JNI, so we spent just over a day validating that we could communicate bi-directionally, maintaining a single process space.
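
The bidirectional check itself was mostly standard JNI plumbing. A hypothetical sketch of the Java half (the library and method names here are made up):

    // Hypothetical sketch of the Java side of the JNI bridge; names are made up.
    public class SimulatorBridge {
        static {
            System.loadLibrary("simbridge"); // loads the C++ side, which lives in the same process
        }

        // Java -> C++: implemented natively on the C++ side.
        public native void sendToNative(byte[] message);

        // C++ -> Java: the C++ side looks this method up and calls it,
        // which is what keeps everything within a single process space.
        public void receiveFromNative(byte[] message) {
            // hand the bytes to the request-handling pipeline
        }
    }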

Along the way we moved from using hand-rolled test doubles to using JMock 2 to create mock objects. I mentioned this to a friend of mine, who lamented that there are several issues with a mock-based approach:
  • It is easy to end up with a bunch of tested objects but no fully-connected system
  • Sharing setup between various mocks is difficult and often not done, so there’s a lot of violation of DRY
  • You have to learn a new syntax

We accepted learning a new syntax because it was deemed less painful than maintaining the existing hand-rolled test doubles (though there are several reasonable solutions for that; ask if you want to know what they are). There is the issue of sharing setup on mocks, but we did not have enough work yet to really notice that as a problem. However, they were at least aware of it and we briefly discussed how to share common expectation-setting (it’s well supported).
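
For the curious, the JMock 2 syntax we adopted looks roughly like this, shown here against the hypothetical interfaces from the sketch above rather than the team’s real classes:

    // A minimal JMock 2 example against a hypothetical collaborator.
    import org.jmock.Expectations;
    import org.jmock.Mockery;
    import org.junit.Test;

    public class SimulatorGatewayExpectationsTest {
        private final Mockery context = new Mockery();

        @Test
        public void issuesExactlyOneCommandToTheSimulator() {
            final SimulatorGateway simulator = context.mock(SimulatorGateway.class);
            final Command command = context.mock(Command.class);

            context.checking(new Expectations() {{
                oneOf(simulator).issue(command); // "one(...)" in older jMock 2 releases
            }});

            simulator.issue(command); // in a real test, the object under test makes this call

            context.assertIsSatisfied();
        }
    }

Shared expectation-setting typically ends up in small helper methods (or a common Expectations subclass) that individual tests call from inside checking(...).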

Finally, there’s the issue of not having a fully connected system. We knew this was an issue so we started by writing an integration test using JUnit. We needed to design a system that:
  • Could be tested up to but excluding the JNI stuff
  • Could be configured to stub out JNI or use real JNI
  • Was easily configurable
  • Was automatically configured by C++ (since it was a C++ process that was started to get the whole system in place)

We designed that (a 15-minute whiteboard session), coded it, and ended up with a few integration tests. Along the way, we built a simple factory for creating the fully-connected system. That factory was used both in tests and by the JNI-based classes to make sure that we had a fully-connected system when it was finally started by C++.
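
The factory was nothing fancy; the idea was roughly this (hypothetical names again, and the concrete classes merely stand in for the real implementations):

    // One place that knows how to wire the fully-connected system.
    // The JNI edge is the only piece that varies between tests and production.
    public class SimulatorSystemFactory {
        public static RequestHandler create(ResponseSender jniEdge) {
            return new RequestHandler(new ProtocolCommandParser(),      // hypothetical implementations
                                      new QueuedSimulatorGateway(),
                                      new ProtocolResponseFormatter(),
                                      jniEdge);
        }

        // Integration tests: stub the JNI boundary, keep everything else real.
        public static RequestHandler createForTests() {
            return create(new ResponseSender() {
                public void send(byte[] protocolBytes) { /* no native call in tests */ }
            });
        }
    }

Because both the tests and the JNI entry point go through the same create(...) method, the configuration exercised by the integration tests is the same one the C++ process ends up starting.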

Near the end, we decided we wanted to demonstrate asynchronous computation, which we did using tests. I stumbled a bit but we got it done in a few hours. We demonstrated that the system receiving messages from the outside world basically queued up requests rather than making the sender wait synchronously (we demonstrated this indirectly – that might be a later blog post – let me know if you’re interested).
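
In spirit, the receive path looked something like the sketch below (a sketch only; the real error handling and threading policy were theirs):

    // Sketch of the asynchronous receive path: the caller's thread only enqueues;
    // a single worker thread drains the queue and runs the request-handling pipeline.
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class AsyncReceiver {
        private final BlockingQueue<byte[]> requests = new LinkedBlockingQueue<byte[]>();

        public AsyncReceiver(final RequestHandler handler) {
            Thread worker = new Thread(new Runnable() {
                public void run() {
                    try {
                        while (true) {
                            handler.handle(requests.take()); // blocks until a request arrives
                        }
                    } catch (InterruptedException stop) {
                        Thread.currentThread().interrupt(); // allow shutdown
                    }
                }
            });
            worker.setDaemon(true);
            worker.start();
        }

        // Called from the outside world (e.g. the JNI callback); returns immediately.
        public void receive(byte[] rawMessage) {
            requests.offer(rawMessage);
        }
    }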

By the way, that was the first week. These guys were good and I had a great time.

There was still a little work to be done on the C++ side and I only had a week, so I asked them to keep me posted. The following Tuesday they had the first end-to-end interaction, system initialization.

By Wednesday (so 3 business days later), they had a complete demonstration of end-to-end interaction with a single, simple job finishing. Not long after that they demonstrated several simple jobs finishing. The next thing on their list? Completing more complex jobs, system configuration, etc.

However, it all goes back to having a well-defined protocol. After we had one system interaction described end-to-end, doing the next thing was easier:
  • Select a system interaction
  • List all of the steps it needs to accomplish (some requests required a response, some did not)
  • Write unit tests for each “arm” of the interaction

So they had a very natural way to form the backlog:
  • Select a set of end-to-end interactions that add value to the user of the system

They also had an easy way to create a sprint backlog:
  • For each system-level interaction, enumerate all of its steps and add implementing those steps as individual backlog items

Now some of those individual steps will end up being small (less than an hour) but some will be quite large when they start working with variable parameters and commands that need to operate at a higher priority.

But they are well on their way and I was reminded of just how much I really enjoyed using Fusion.

Comments

  1. Yanic about 15 hours later:

    Using a ‘sequence diagram’-heavy (or equivalent) process in that period sounds nothing short of heroic :o)

    I know which one to use nowadays, but what tool did you use for the diagrams back then? Was it productive?

  2. Brett L. Schuchert about 16 hours later:
    I used several tools:
    • Paper and pencil
    • wet-erase boards
    • Visio (mostly this)

    I still use Visio today. I know I used Visio in 1993, maybe earlier. I remember Craig Larman (yes that Craig Larman) and I working through a diagram that showed the course flow for our C++ and Smalltalk classes. We used the drawing to try and get the classes as close as possible.

    As for the tool you linked to, I’m leery of using any tool other than hand-drawn or Visio (NOT using their stencils). I don’t draw them very often and when I do, I like the fact that the Visio stencils I use don’t require the drawing to be UML-semantically correct.

  3. Yanic about 18 hours later:

    Visio and by hand, heroic indeed!

    However, that means devoting much of your time and energy to the graphical aspect instead of the content.

    Maybe you should give Trace Modeler a try and see what a difference 15 years have made. Or watch this 30 sec demo.

    I didn’t know Fusion put so much emphasis on modelling scenarios, IMHO that sounds like the right thing to do!

    I think the current OO methods put too much focus on the static structure (i.e. OOD = “doing class diagrams”) and basically boil down to data modelling. Nothing’s wrong with data modelling if your app is mostly data shuffling anyway, but I think it’s a pity the distinction isn’t made.

    The same goes for ‘object oriented’ analysis, but perhaps that was Bob’s point you were alluding to? (Btw, do you have any references to his take on OOA?)

    IMHO the static structure should mostly be the result of whatever dynamics are decided upon. Sounds like you and Craig got it right back then!

  4. Brett L. Schuchert 3 days later:

    It’s been a long time since I actually had a look at Fusion. The strong emphasis on scenarios might have been cultural (this was when I worked at a now defunct company called Object Space).

    As for references to Bob’s take on OOA, no references. He’s not a strong believer in OOA. I sort of agree. There’s analysis, and there’s nothing OO (or non-OO) about it. Even so, I think there can be some value in domain analysis, but that is a very long discussion indeed.

    In a nutshell, people who know their domain would get a lot of value out of domain analysis because in my experience “knowing the domain” is equivalent to “understanding the current system implementation” in those situations, and that’s unfortunate.

    Since I work as a consultant and a generalist at that, domain analysis helps me to better understand the domain I’m working in.

    I took a look at that 30 second demo. It looks like a good tool. And I agree that it might seem like I could spend a lot of time on the graphical aspect of creating the diagrams. In fact, I often did just that (I don’t anymore).

    On the other hand, I can draw things quickly (usually in minutes) and the act of using Visio is sometimes like a second review. I take the time to draw things out but I usually do things in steps:

    First, I’ll create the basic diagram and message flow (generally with no parameters).

    Then I’ll start detailing out the information I want to send back and forth.

    Then, if it’s important, I’ll add in activation boxes, and so on. So as I work through them I add layer upon layer of information, thinking deeper and deeper.

    So it’s not so much a drawing as a means to organize my thoughts.

    I can draw them in Visio as fast as (or faster than) I can draw them by hand. But I don’t generally create hi-fidelity sequence diagrams.

  5. Stephen G. Edwards 15 days later:

    Wow, I thought I was the only person in the world who was into Fusion back then!

    At the time, I was disgusted with the way the Booch-Rumbaugh-Coad juggernaut approached analysis. I don’t have a problem with using classes and relationships to model domain elements. I think this can be quite intuitive.

    But problems arise when you start to inject behavior into the model. If you adhere strictly to the OO analogy, you are forced to use messaging. This means that you are making design decisions at analysis time. Objects and concepts in the real world do have relationships, but typically do not talk to each other, so it’s a completely unnatural thing to do.

    I thought that Fusion’s system state modeling approach was brilliant in that it made excellent use of the domain model while avoiding the trap of early design. It was a breath of fresh air.

    I still use this approach, since I think it is very useful for elaborating use cases. But I tend to keep it strictly in the background as “scratch” work that is not shown to clients (or even most other developers). I think it leaves most people cold, because when they see class diagrams, they also expect to see messaging.

    I was in contact with the authors at the time, who seemed eager for my feedback from my use of Fusion on a pilot project. Unfortunately, I was soon overwhelmed by other responsibilities and unable to collaborate. Sorry, Derek et al!
