The sound mirrors of Lade Pit, Dungeness

Introduction

The Sound Mirrors of Lade Pit are the result of a programme to develop an
aircraft early warning system before the advent of radar. I first heard about
them over ten years ago when I read an article in the Sunday Telegraph
magazine on Dungeness. I was intrigued and wanted to go and see them but never
got around to doing anything about it until last year. The sound mirrors are
on private land and not normally accessible to the public. On 17th August 2008
I attended a walk led by Dr. Scarth, apparently the world's expert on sound
mirrors, who told the fascinating story behind them.

Although the mirrors were able to detect aircraft with varying degrees of
success, the research programme apparently had a lot of difficulty with noise
interference. Another problem was that as aircraft speed increased the mirrors
were no longer able to give an adequate advance warning. Finally, the research
programme that resulted in radar began to achieve better results and interest
in the sound mirrors evaporated.

This is the view from the entrance to the land owned by CEMEX; only the 200'
mirror is visible. The lake was created by gravel extraction and did not exist
at the time the mirrors were built.

\(JPG\)

Mirror lake

This shows, from left to right, the 200 foot mirror, the 20 foot mirror and
the 30 foot mirror.

\(JPG\)

3 mirrors

Dr. Scarth gave a short talk on the background behind the sound mirrors before
leading us towards the island where they are located.

\(JPG\)

Dr. Scarth

In order to protect the sound mirrors from further vandalism access to the
island is only possible via a swing bridge that is normally locked in the open
position. A metal plaque is mounted on this bridge.

\(JPG\)

Plaque

The text on this plaque reads:

The three concrete structures visible from this bridge are Sound Mirrors or
Listening Devices. They were built around 1930 for experiments to detect enemy
aircraft. The concave shape of the mirror collected and amplified sound from
aircraft engines and could also indicate from which direction the noise was
coming. The structures were intended to provide air raid warnings and to
assist interception of an attack. This system was never fully operational and
was made obsolete by methods of detection based on radio waves that were
introduced from the late 1930s. This system became radar.

A Government Acoustic Research Station was established at Hythe in 1922. The
first mirror built here at Denge in 1928 was part of a chain linked to this
site. Subsequent experiments with the size and shape of mirrors led to the
construction of the other two structures here by 1930. These mirrors are thus
a unique group demonstrating the different mirror types. When the structures
were built the lakes formed by gravel extraction did not exist and the
experimental mirrors stood in isolation on the Dungeness shingle, served by a
narrow gauge railway. The Sound Mirrors were located in isolated places to
avoid extraneous noise, and experimental target aircraft were provided by
commercial flights over the Channel, and also by RAF aircraft during Air
Defence of Great Britain exercises in the 1930s.

The structures are a scheduled monument. They were repaired in 2003 by
English Heritage with financial support provided by the Aggregates Levy
Sustainability Fund through Defra and from the EU’s Interreg III programme
(the Historic Fortifications Network co-ordinated by Kent County Council).
Valuable assistance was provided by the quarry operator, CEMEX. Access is not
possible to the island containing the three structures. This is so as to
protect the structures but also to safeguard the bird populations using the
lake.

The right hand structure is the 30-foot mirror (the measurement is the
diameter), completed in 1930. Its bowl construction is unique. Beneath this
can be seen the chamber in which the operator sat, listening to sound
collected from the mirror surface by rotating the still visible detector and
conveyed to the ears by an instrument resembling a stethoscope. The smallest
structure in the centre of the group is the 20-foot mirror of 1928. It
originally had a concrete pillar in front of the mirror surface, supporting a
movable sound detector. The largest structure is the 200-foot mirror of 1930
(referring to its length). The curved wall concentrated sound waves that were
detected by a row of microphones positioned on the concrete forecourt of the
structure. Only one other mirror of this design exists, at Malta.

Apparently 261 people attended this walk, the largest group in the 11 years
Dr. Scarth has been giving these talks.

\(JPG\)

Crowd

20’ mirror

The 20’ mirror is the oldest, built in 1928, and also the least visually
interesting. According to the plaque there was originally a concrete pillar in
front of it on which was mounted the sound detection apparatus but I didn’t
notice any sign of it.

\(JPG\)

20’ mirror front

\(JPG\)

20’ mirror side

\(JPG\)

30’ and 20’ mirror backs

30’ mirror

The 30' mirror, built in 1930, is also the best preserved. The operator
would sit in a cabin directly under the dish wearing an apparatus similar to a
stethoscope. The black pole with the oddly shaped tube ("trumpet") on top is
the remains of the sound detection apparatus. Originally there were mechanical
linkages allowing both the pole and the "trumpet" to be rotated - the
operator would move the trumpet in order to get the strongest signal, and the
position of the trumpet would give an indication of the direction the
aircraft was approaching from.

\(JPG\)

30’ mirror

\(JPG\)

30’ mirror 2

The box under the dish is the remains of the operator cabin. The wood and
glass forming the rest of the cabin has long since gone.

\(JPG\)

30’ mirror front side

\(JPG\)

30’ mirror side

You can see that the ground level has dropped since the mirrors were
constructed (due to gravel extraction?), as the top of these stairs would
originally have been at ground level.

\(JPG\)

30’ mirror control room stairs 1

Stairs at the back of the 30’ mirror leading down to the control room

\(JPG\)

30’ mirror control room stairs 2

200’ mirror

While the 20' and 30' mirrors were purely acoustic, relying on tubes to
conduct the sound collected by the mirror to the operator, the 200' mirror
used an array of microphones placed in a forecourt in front of the mirror.
Another difference is that the smaller mirrors relied on a steerable detector
to find the direction from which the aircraft were approaching, whereas the
200' mirror found the direction of approach by determining which of the
microphones were giving the strongest signal.

Unfortunately a large amount of the front of the 200’ mirror has been reduced
to rubble. Apparently, the operator of the gravel pit extracted too much
gravel and undermined the front of the mirror causing it to collapse.

\(JPG\)

View along 200’ mirror

\(JPG\)

200’ mirror front 1

\(JPG\)

200’ mirror front 2

\(JPG\)

200’ mirror front right

This shows the 200’ mirror from the front. The stairs in the foreground would
originally have been at ground level. Unlike the other mirrors on this site
the 200’ mirror used microphones placed along a wall in front (I think along
the channel visible running from the stairs along the front of the mirror).

\(JPG\)

200’ mirror stairs

\(JPG\)

200’ mirror back

This image shows the remains of the 200' mirror control room, located at the
back of the mirror. The rectangular hole would originally have contained
glass, allowing the operators to see the area in front of the mirror.

\(JPG\)

200’ mirror control room

More information

More photos of these sound mirrors and others around the country can be found
on this website by Andrew Grantham.

Dr. Scarth also published two books on the sound mirrors (now out of print):
Mirrors by the Sea: Account of the Hythe Sound Mirror System Based on
Contemporary Letters and Reports, Hythe Civic Society (April 1995), and
Echoes from the Sky: A Story of Acoustic Defence, Hythe Civic Society
(September 1999).


Java Architecture for XML Binding (JAXB)

So, I was working on a little program to make it easier to configure collections of
servos making up the
robot I’m working on when I got to the point of figuring
out how I was going to persist this information. Although the sum total of
configuration is trivial at the moment, I have plans to eventually include
enough structural information about the robot that a simple textual list of
properties would become unwieldy. An XML format seemed like the obvious
choice.

I wrote a simple Relax NG schema and was just about to write yet another SAX
based parser and Java Writer when I had this sense that I'd been in this
situation way too many times before and that there had to be a better way.
Processing an in-memory DOM tree and then using Java's XML processing to
read/write it didn't sound like a whole lot of fun so in the end I thought
I'd give JAXB a go. I'd heard about it years ago (it's not a new Java
technology) but never had an excuse to use it before, so now seemed like a
good time.

There are several tutorials and articles about JAXB out there - unfortunately
they all seem to cover different versions. JAXB 2.x seems much less hassle to
use than JAXB 1.x. Also, while the current version of JAXB is 2.1, the version
of JAXB bundled with the Java 1.6 SDK is 2.0 and the bundled version doesn’t
have the ant task. You can download the current version of JAXB from
jaxb.dev.java.net.

In order to prevent examples getting too long we'll stay with a fragment of
the complete data structure: a Robot, which is simply a collection of
servos.
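
To make that concrete, a configuration document in this format looks
something like this (the attribute values are just made-up examples):

```
<?xml version="1.0" encoding="UTF-8"?>
<robot id="hexapod" name="My hexapod">
  <servo id="0" name="front_left_hip"
         min_angle="-1.570796327" max_angle="1.570796327"
         rest_angle="0" min_pulse_width="500"
         max_pulse_width="2500" max_speed="2500"/>
</robot>
```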

The impression I got from reading the tutorials and documentation is that you
need to start with a schema and generate Java classes from there. I’m not a
great fan of XML Schema so I decided to stick with my Relax NG schema:

```
<?xml version="1.0" encoding="UTF-8"?>
<grammar xmlns="http://relaxng.org/ns/structure/1.0"
         xmlns:a="http://relaxng.org/ns/compatibility/annotations/1.0"
         datatypeLibrary="http://www.w3.org/2001/XMLSchema-datatypes">
  <start>
    <ref name="robot"/>
  </start>
  <define name="robot">
    <element name="robot">
      <ref name="identifiable-element"/>
      <zeroOrMore>
        <ref name="servo"/>
      </zeroOrMore>
    </element>
  </define>
  <define name="identifiable-element">
    <attribute name="id"/>
    <attribute name="name"/>
  </define>
  <define name="servo">
    <element name="servo">
      <ref name="identifiable-element"/>
      <optional>
        <attribute name="min_angle" a:defaultValue="-1.570796327"><data type="decimal"/></attribute>
      </optional>
      <optional>
        <attribute name="max_angle" a:defaultValue="1.570796327"><data type="decimal"/></attribute>
      </optional>
      <optional>
        <attribute name="rest_angle" a:defaultValue="0"><data type="decimal"/></attribute>
      </optional>
      <optional>
        <attribute name="min_pulse_width" a:defaultValue="500"><data type="integer"/></attribute>
      </optional>
      <optional>
        <attribute name="max_pulse_width" a:defaultValue="2500"><data type="integer"/></attribute>
      </optional>
      <optional>
        <attribute name="max_speed" a:defaultValue="2500"><data type="integer"/></attribute>
      </optional>
      <optional>
        <attribute name="desired_position" a:defaultValue="0"><data type="integer"/></attribute>
      </optional>
    </element>
  </define>
</grammar>
```

I then used Oxygen XML's built-in copy of trang to convert this to XML
Schema:

<?xml version="1.0" encoding="UTF-8"?> <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified"> <xs:element name="robot"> <xs:complexType> <xs:sequence> <xs:element minOccurs="0" maxOccurs="unbounded" ref="servo"/> </xs:sequence> <xs:attributeGroup ref="identifiable-element"/> </xs:complexType> </xs:element> <xs:attributeGroup name="identifiable-element"> <xs:attribute name="id" use="required"/> <xs:attribute name="name" use="required"/> </xs:attributeGroup> <xs:element name="servo"> <xs:complexType> <xs:attributeGroup ref="identifiable-element"/> <xs:attribute name="min_angle" default="-1.570796327" type="xs:decimal"/> <xs:attribute name="max_angle" default="1.570796327" type="xs:decimal"/> <xs:attribute name="rest_angle" default="0" type="xs:decimal"/> <xs:attribute name="min_pulse_width" default="500" type="xs:integer"/> <xs:attribute name="max_pulse_width" default="2500" type="xs:integer"/> <xs:attribute name="max_speed" default="2500" type="xs:integer"/> <xs:attribute name="desired_position" default="0" type="xs:integer"/> </xs:complexType> </xs:element> </xs:schema>

Running xjc from the command line (see the example invocation after this
list) then generated 5 Java classes that would allow me to read/write this
format. There were just a couple of snags:
- The generated classes Robot and Servo had no behaviour. In my existing
codebase these classes needed behaviour, and editing generated code is clearly
a bad idea.
- My hand written classes used primitive int and double but the
generated classes used java.lang.Integer and java.lang.Double. [1]
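
For reference, the invocation was something along these lines (the exact
flags vary between JAXB versions, so treat this as a sketch):

```
xjc -d src -p org.emptiness.hexapod.autogen.robotmodel robot.xsd
```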

There is a section in the Unofficial JAXB Guide on adding behaviour to
generated classes which suggests that the solution to my first problem was to
define classes which extend the generated classes to add the behaviour, and
then create a custom factory class that makes JAXB instantiate instances of
the subclasses instead of the generated classes.

The solution to the second problem seemed to be to use JAXB’s support for
customisation to force it to use primitive int and double for the attribute
values.

You can customise how JAXB generates classes using annotations in the schema
or an external binding file. Since I was generating my schema from Relax NG
and, in any case, wanted to keep the schema clean, I elected to go for an
external binding file.

I wanted to keep convenient names for my hand-written classes and so I elected
to use customisation to both change the data types of the generated classes
instance variables and to rename the generated classes to have the suffix
“Element.” Here is my final attempt at a JAXB binding file to do this:

<jxb:bindings version="1.0" xmlns:jxb="http://java.sun.com/xml/ns/jaxb" xmlns:xs="http://www.w3.org/2001/XMLSchema"> <jxb:bindings schemaLocation="robot.xsd" node="/xs:schema"> <jxb:globalBindings> <jxb:javaType name="int" xmlType="xs:integer"/> <jxb:javaType name="double" xmlType="xs:decimal" /> </jxb:globalBindings> <jxb:schemaBindings> <jxb:package name="org.emptiness.hexapod.autogen.robotmodel"> <jxb:javadoc> <![CDATA[<body> Package level documentation for generated package org.emptiness.hexapod.autogen.robotmodel.</body>]]> </jxb:javadoc> </jxb:package> <jxb:nameXmlTransform> <jxb:elementName suffix="Element"/> </jxb:nameXmlTransform> </jxb:schemaBindings> </jxb:bindings> </jxb:bindings>

Unfortunately, although this did correctly generate classes called
RobotElement and ServoElement, the element attribute values were still
mapped to java.lang.Integer and java.lang.Double. Some more googling
found this forum post and another one which indicated that JAXB was forcing
the use of classes since the attributes were optional and might therefore
need to be null. There did not seem to be any way around this and so I
reluctantly decided to leave things as they were and be grateful that Java
supports autoboxing, so that I could get away without having to modify any
code using my Servo class.

I created my custom factory:

```
package org.emptiness.hexapod.core;

import javax.xml.bind.annotation.XmlRegistry;

import org.emptiness.hexapod.autogen.robotmodel.ObjectFactory;
import org.emptiness.hexapod.autogen.robotmodel.RobotElement;
import org.emptiness.hexapod.autogen.robotmodel.ServoElement;

@XmlRegistry
public class RobotObjectFactory extends ObjectFactory {

    /**
     * Create an instance of {@link RobotElement }
     */
    public RobotElement createRobotElement() {
        System.err.println("RobotElement createRobotElement");
        return new Robot();
    }

    /**
     * Create an instance of {@link ServoElement }
     */
    public ServoElement createServoElement() {
        System.err.println("ServoElement createServoElement");
        return new Servo();
    }
}
```

But try as I might I could not get JAXB to use my factory to create instances
of my classes instead of the ones it had generated. By this time I was
thoroughly fed up with JAXB and ready to try something else. A little further
reading suggested that the way forward might be to use annotations to mark up
my handwritten classes and abandon the whole idea of getting JAXB to generate
the data bearing classes. After some messing about I had a new Servo class
that looked a little like this (for brevity I've removed all the constant
declarations, behaviour methods and getters/setters from the code below):

```
package org.emptiness.hexapod.core;

import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlAttribute;
import javax.xml.bind.annotation.XmlRootElement;
import javax.xml.bind.annotation.XmlSchemaType;
import javax.xml.bind.annotation.XmlTransient;
import javax.xml.bind.annotation.XmlType;
import javax.xml.bind.annotation.adapters.XmlJavaTypeAdapter;

import org.emptiness.hexapod.core.jaxb.DoubleAdapter;
import org.emptiness.hexapod.core.jaxb.IntegerAdapter;
import org.emptiness.hexapod.device.ServoController;

@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "")
@XmlRootElement(name = "servo")
public class Servo implements RobotComponent {

    // configuration information
    @XmlAttribute(required = true)
    @XmlSchemaType(name = "anySimpleType")
    private String id; // identifier passed to the controller

    @XmlAttribute(required = true)
    @XmlSchemaType(name = "anySimpleType")
    private String name; // meaningful human name

    @XmlAttribute(name = "min_angle")
    @XmlJavaTypeAdapter(DoubleAdapter.class)
    @XmlSchemaType(name = "decimal")
    private Double minAngle; // (radians)

    @XmlAttribute(name = "max_angle")
    @XmlJavaTypeAdapter(DoubleAdapter.class)
    @XmlSchemaType(name = "decimal")
    private Double maxAngle; // (radians)

    @XmlAttribute(name = "rest_angle")
    @XmlJavaTypeAdapter(DoubleAdapter.class)
    @XmlSchemaType(name = "decimal")
    private Double restAngle; // (radians)

    @XmlAttribute(name = "min_pulse_width")
    @XmlJavaTypeAdapter(IntegerAdapter.class)
    @XmlSchemaType(name = "integer")
    private Integer minPulseWidth; // (microseconds)

    @XmlAttribute(name = "max_pulse_width")
    @XmlJavaTypeAdapter(IntegerAdapter.class)
    @XmlSchemaType(name = "integer")
    private Integer maxPulseWidth; // (microseconds)

    @XmlAttribute(name = "max_speed")
    @XmlJavaTypeAdapter(IntegerAdapter.class)
    @XmlSchemaType(name = "integer")
    private Integer maxSpeed; // (max change in pulse width per second)

    // current state
    @XmlTransient
    private int actualPosition = UNKNOWN_POSITION;

    @XmlTransient
    private int desiredPosition = UNKNOWN_POSITION;

    @XmlTransient
    private int speed = DEFAULT_SPEED;

    @XmlTransient
    private ServoController controller = null;
}
```
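
The DoubleAdapter and IntegerAdapter classes imported above aren't shown in
the listing; a minimal version of the double one might look something like
this (my sketch rather than the actual class):

```
package org.emptiness.hexapod.core.jaxb;

import javax.xml.bind.DatatypeConverter;
import javax.xml.bind.annotation.adapters.XmlAdapter;

// Converts between the lexical form of xs:decimal and java.lang.Double.
public class DoubleAdapter extends XmlAdapter<String, Double> {

    public Double unmarshal(String value) {
        return DatatypeConverter.parseDouble(value);
    }

    public String marshal(Double value) {
        return value == null ? null : DatatypeConverter.printDouble(value);
    }
}
```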

Since I was now annotating classes which contained state that should not be
persisted, I had to use @XmlTransient to prevent JAXB from attempting to
persist those instance variables.

The Robot class now looked like this (again with large chunks of code
removed):

```
package org.emptiness.hexapod.core;

import java.io.File;
import java.io.FileOutputStream;
import java.io.OutputStream;
import java.util.ArrayList;
import java.util.List;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.Unmarshaller;
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlAttribute;
import javax.xml.bind.annotation.XmlRootElement;
import javax.xml.bind.annotation.XmlSchemaType;
import javax.xml.bind.annotation.XmlType;

@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "", propOrder = {
    "servo"
})
@XmlRootElement(name = "robot")
public class Robot {

    protected List<Servo> servo;

    @XmlAttribute(required = true)
    @XmlSchemaType(name = "anySimpleType")
    protected String id;

    @XmlAttribute(required = true)
    @XmlSchemaType(name = "anySimpleType")
    protected String name;

    public Robot() {
    }

    // file operations
    public static Robot load(File inputFile) throws Exception {
        //JAXBContext context = JAXBContext.newInstance(ObjectFactory.class);
        JAXBContext context = JAXBContext.newInstance(RobotObjectFactory.class);
        Unmarshaller u = context.createUnmarshaller();
        //u.setProperty("com.sun.xml.bind.ObjectFactory", new RobotObjectFactory());
        Robot robot = (Robot) u.unmarshal(inputFile);
        return robot;
    }

    public void save(File outputFile) throws Exception {
        JAXBContext context = JAXBContext.newInstance(RobotObjectFactory.class);
        Marshaller m = context.createMarshaller();
        m.setProperty("jaxb.formatted.output", Boolean.TRUE);
        OutputStream os = new FileOutputStream(outputFile);
        m.marshal(this, os);
        os.close();
    }

    public List<Servo> getServos() {
        return getServo();
    }

    public List<Servo> getServo() {
        if (servo == null) {
            servo = new ArrayList<Servo>();
        }
        return this.servo;
    }
}
```

Since the servo element is called "servo" JAXB expects the getter method for
accessing the list to be called "getServo". I, however, felt the method was
more naturally called "getServos" and this was what I used in my existing
code, so I created a wrapper method to allow me to continue using
"getServos". The load() and save() methods above show the small amount of
code actually required to invoke JAXB to read or write the data from/to a
file.
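
For completeness, using those methods then amounts to a one-liner each way
(file names invented for the example):

```
import java.io.File;

// Assumes this class lives alongside org.emptiness.hexapod.core.Robot.
public class RoundTrip {
    public static void main(String[] args) throws Exception {
        // Read a configuration and write it straight back out again.
        Robot robot = Robot.load(new File("robot.xml"));
        robot.save(new File("robot-copy.xml"));
    }
}
```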

So, I now have the ability to read and write my data to XML without having
had to write yet more code to implement a SAX interface or generate XML with
PrintWriter.println(). However, getting things working with JAXB took me far
longer than the "SAX + println()" approach would have done and I can't say
that I'm 100% happy with the final result. I may eventually try another
approach like XStream and see if that gives better results with less overall
hassle.

Links
- Official JAXB home page
- Wikipedia page on JAXB
- FAQ entry on adding behaviours to JAXB generated classes
- A practical guide to JAXB 2.0 (The Register)
- XML annotations javadoc
- XStream


Review & plot summary: The Reader

The Reader is not a film about redemption, nor forgiveness, perhaps not even
understanding. It is a film, I think, about shame, regret, love and the times
when love is not enough to stop us from hurting those we love, or enough to
make us give the comfort the loved one seeks. It is a sad film, an emotional
film, the sort of film that makes us wonder why, and that is perhaps the best
kind of film there is.

Michael, a young German, is only fifteen when, by chance, he meets a much
older woman, Hanna, who helps him when he is sick. Months later they meet
again and begin a passionate affair that lasts the whole summer. Hanna is not
always an easy woman to be with, but she loves to be read to and it becomes
clear that she loves Michael at least as much as he comes to love her, though
she seems naturally suspicious and finds it hard to express her feelings. It's
all the more puzzling, then, why she suddenly leaves town without a word on
Michael's sixteenth birthday. Is it because he is sixteen, because he avoided
his own birthday party to be with her, or because she was promoted from being
a ticket collector on a tram to office work? Whatever the reason, she leaves
abruptly with no explanation.

Eight years later, in 1966, Michael is a law student attending a Nazi war
crimes trial. He is shocked when Hanna appears as one of the defendants,
accused of being a guard at a concentration camp and of allowing over three
hundred women to die in a fire. There are five other defendants who deny the
charges against them, refuse to answer questions and point to Hanna as being
the one in charge. Hanna, for her part, appears to answer the questions of
the judges honestly and seems upset.

A lot of the above is apparent from the publicity for the film, which makes
much of the big secret that Hanna is protecting. If I'm honest this is why I
came to see the film, as I'm a sucker for mysteries - curiosity always gets
the better of me. However, it's also apparent by this point that the secret
is no big secret at all and in fact the answer has been in front of us all
along. To be honest, if I'd known the "secret" was so banal I wouldn't have
gone to see the film, and I would have been the poorer for it, since it is a
film well worth seeing - the performances are stunning and the characters
have a real emotional depth; they feel pain and you feel their pain with
them.

Michael makes an attempt to see Hanna in prison but finds himself unable to
go through with it and meet the woman he loved knowing what he now knows.
Hanna is found guilty and sentenced to life imprisonment.

Years later, Michael has been married, had a daughter and divorced. He finds
his old school books at his parents' house, records himself reading them and
sends the tapes to Hanna. The tapes give Hanna new life and she begins to
write to Michael, but he never replies or sends any letters of his own or any
personal message of any kind, just the tapes of him reading the books.

In 1986, Hanna has been in prison for twenty years and is on the point of
being released. The prison contacts Michael, as the only person who has had
any contact with Hanna in all this time even though he has never visited her,
and asks if he can help her on her release. Eventually, Michael manages to
force himself to visit Hanna, but though she loves him still he can't bring
himself to show her much warmth, and though he's arranged a place for her to
live it's clear that he's not offering her any place in his life. A week
later, after decorating her new home and perhaps regretting his previous
coldness, Michael comes to collect Hanna from prison, bringing her flowers,
only to find she has committed suicide. She has left a note instructing the
warden to tell Michael that she said "hello" and to give her remaining money
to Michael for him to pass on to a woman who survived the fire.

Michael tracks the woman down to give her the money, and though the woman is
not able to offer any forgiveness for Hanna, Michael does manage to find a
kind of release.

Finally, in 1995, he takes his daughter to Hanna's grave and starts to tell
her what he has told no one else: about a summer when he was fifteen years
old, met a woman and fell in love for the first and possibly the only time in
his life.

One thing I liked about The Reader was that at no time did it attempt to
excuse Hanna's actions, claim she was not a bad person really or say that she
was not aware of what she had done. In fact, apart from the excerpts from the
war crimes trial, we are not told very much at all. It does seem that Hanna
regrets her actions, but most of all we are shown the effect on our lives
that a simple choice can make: during the trial it was shown that just before
she joined the SS Hanna was offered a job at Siemens but chose the SS because
it was "a job", and from that one choice everything else flows. It's never
spelled out exactly why Hanna made that choice, but it seems clear, once we
know her secret, that working for Siemens would have exposed it and made her
shame public.

You want to know the secret? If you haven't worked it out you'll just have to
go and see the film, won't you!


Crema

I finally discovered that the secret to getting a decent crema out of my De'Longhi coffee machine was to load in about 30% more coffee than I would normally use. I spent way too long thinking it was all about how hard I tamped down the coffee, trying either a lot of pressure or very little - that seemed to affect how fast the coffee came out of the machine but very little else. The De'Longhi seems to have a fairly feeble pump and if you tamp the coffee down too much the coffee drips out at an infuriatingly slow rate (once I got no coffee at all, just a lot of steam escaping from the gasket).

Update 6/11/2021: Maybe you’re wondering what types of coffee drink exist. In this case you might find this article The Ultimate Guide to Coffee in Italy useful since it covers what drinks you can get in Italy and where to get them.

Robot Configuration

Moving servos using ASCII commands typed into a terminal emulator to control
the servo controller via a serial link soon lost its appeal.

I’m currently working on a little Java Swing app to allow me to configure a
robot and interactively adjust servos in order to derive configuration
parameters such as minimum/maximum movement angle and the best “rest position”
to start the servo up in.

\(PNG\)

RoboConfig UI (such as it is currently)

There’s currently not much to the application - a tree widget shows the robot
structure and clicking on tree nodes shows the appropriate editor pane on the
right. Currently, I can only create servo configuration data but the goal is
to eventually allow me to manipulate a representation of the mechanics of the
robot and allow some simulation of the hardware to make algorithm development
easier.

The software uses RXTX to find serial interfaces and
communicate with the servo controller.
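
The basic shape of the serial code is simple enough; something like the
following sketch (the port name, baud rate and servo channel are assumptions,
and the real application wraps this in rather more error handling):

```
import java.io.OutputStream;

import gnu.io.CommPortIdentifier;
import gnu.io.SerialPort;

// Opens a serial port with RXTX and sends one SSC-32 position command.
public class ServoLink {
    public static void main(String[] args) throws Exception {
        CommPortIdentifier id =
                CommPortIdentifier.getPortIdentifier("/dev/ttyUSB0");
        SerialPort port = (SerialPort) id.open("RoboConfig", 2000);
        port.setSerialPortParams(115200, SerialPort.DATABITS_8,
                SerialPort.STOPBITS_1, SerialPort.PARITY_NONE);

        OutputStream out = port.getOutputStream();
        // SSC-32 syntax: "#<channel> P<pulse width in us> T<time in ms>"
        // terminated by a carriage return.
        out.write("#0 P1500 T1000\r".getBytes());
        out.flush();
        port.close();
    }
}
```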

Swing programming not much fun either

In one of my first blog entries (Web programming == No Fun) I moaned about
the relative painfulness of creating a web-based user interface compared to
doing the same thing in a desktop UI framework such as Java Swing (which I
was most used to at the time).

Now, a few years (but not that many blog entries) later I find myself in the
opposite situation. I've spent most of the intervening years building web
applications in Java, Javascript and PHP and I found myself needing to dust
off my desktop-UI skills again for some work I was doing on my Hexapod robot
project.

I'm now at the stage where the appeal of typing commands to the servo
controller using a terminal emulator and a serial link has worn off somewhat.
Yes, I can make the robot twitch a bit and the mechanics seem to be working
but some more interesting behaviour would be a definite plus. It seemed like a
good idea to write some software I could run on my PC to configure and control
the robot so I could learn how to program interesting behaviours without
having to learn a new embedded programming environment at the same time. It
wasn’t long before I decided I needed a file format to store the robot
configuration in - things like mapping meaningful names to the numeric IDs
used by the servo controller and storing the min and max joint angles for each
servo etc. After messing about for a while with a terminal emulator and the
robot I decided that I needed some form of interactive interface which would
let me easily set up a configuration and manipulate the robot to find the
correct servo settings.

Without a great deal of thought I picked Java (could have been C++ or Ruby
but I kind of just settled on Java out of habit) and Swing (AWT is a UI
toolkit that should never, IMHO, have seen the light of day and I've no
experience with SWT).

Putting on my rose tinted glasses and remembering how much I used to enjoy
Swing programming I fired up Eclipse and had a bit of a shock - somehow Swing
programming didn't seem nearly as much fun as I'd remembered. Having gotten
used to laying out my user interfaces with HTML and CSS, using Swing layout
managers, for example, seemed like an exercise in masochism.

So what has happened? Why the shock? Well, firstly (and most obviously)
whatever we are most used to often feels easiest (duh!). When I wrote Web
programming == No Fun I was still pretty new to Java web development,
especially STRUTS, and had recently developed a fairly complex Swing
application. Now I've had a few years to forget a lot of my Swing knowledge
and am a more experienced web developer.

I think there's more to it than just familiarity though. Web development has
changed quite a bit for me in the last few years. Although I started out with
STRUTS, more recently I've used Spring MVC (and a proprietary framework at
Ecube Ltd, where I'm presently employed). I would argue that STRUTS, at least
the version I was using back in 2005, is painful and involves the creation of
far too many pointless ActionForm classes and too much XML configuration [2].
Personally, I find Spring MVC far better suited to the way I want to write
code.

Libraries such as prototype, scriptaculous, YUI and so on have made it a lot
easier to incorporate more complex user interface widgets in web
applications. Also, the event based nature of AJAX programming is not that
different from the event based programming you'd do for a Swing application.
Sure, the actual events are different, but it's not the same as spending your
time building a UI in which you have to (re)build the complete page as a
response to every significant user action.

I think Swing [3] is badly lacking the ability to separate structure from
appearance that you gain with HTML, Javascript & CSS. Actually getting things
to look good and appear where you want them to can be painful with both Swing
and HTML/CSS but at least with HTML it’s easy to knock up a functional UI and
tweak the appearance without having to rewrite too much of the code that
actually generates the UI. With Swing how things look is too closely bound
with the code that creates the widgets.

Swing does perhaps have the edge when it comes to complex widgets such as
collapsible trees which are updated when the underlying developer-supplied
model changes, but that also brings additional complexity and I spent a good
few hours wrestling with JTree and the associated TreeModel and events before
I got things working the way I wanted.
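
If you're wrestling with the same thing, one gotcha worth knowing: mutate
the tree through the model, not the nodes, or the JTree never hears about
the change. A stripped down sketch (not my actual code):

```
import javax.swing.JFrame;
import javax.swing.JScrollPane;
import javax.swing.JTree;
import javax.swing.SwingUtilities;
import javax.swing.tree.DefaultMutableTreeNode;
import javax.swing.tree.DefaultTreeModel;

public class TreeEventDemo {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(new Runnable() {
            public void run() {
                DefaultMutableTreeNode root = new DefaultMutableTreeNode("robot");
                DefaultTreeModel model = new DefaultTreeModel(root);
                JTree tree = new JTree(model);

                JFrame frame = new JFrame("Tree demo");
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.add(new JScrollPane(tree));
                frame.setSize(300, 200);
                frame.setVisible(true);

                // root.add(newNode) would change the tree silently;
                // insertNodeInto fires treeNodesInserted so the JTree
                // hears about the change and repaints.
                model.insertNodeInto(new DefaultMutableTreeNode("servo 0"),
                        root, root.getChildCount());
            }
        });
    }
}
```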

I suppose I could have used a GUI builder when writing my little
configuration app, but I'm suspicious of such tools and find that, at best,
they are good for producing a first cut of the UI, after which you usually
end up tweaking some fairly opaque and/or verbose code by hand. In the same
way I also code HTML & CSS by hand - it may be more painful at times but at
least I remain in control.

BTW if you want a look at the little app that inspired these ramblings have a
look at Robot Configuration - it’s still somewhat
krappy and incomplete but it’s progress of a sort anyway.


Wiring up the controller

I'd originally planned to run all the servo connections under the legs and up
through the hub, however that seemed like a recipe for snagged cables and so
I gave up on that idea and ran the cables from the "hip" servos up through
the hub and the others over the top of the legs.

\(JPG\)

plan view

Here you can see the cables from the outer leg servos running over the top of
the leg.

\(JPG\)

Leg plan view

This side view gives another view of the way the cables are arranged. I ended
up depending on cable ties way more than I would have liked. There's one on
the outermost servo to keep the cable from getting trapped in the leg joint,
one each on the top and bottom of each hip to keep the cables from ending up
all over the place, and another to keep the top pair of cables together in
the hope that they are less likely to snag things.

\(JPG\)

Leg side view

This shows the wired up robot with a high stance.

\(JPG\)

High stance

Here with a low stance.

\(JPG\)

Low stance

I've tested the SSC-32 controller with a single servo; now I just need to
check that it can drive all 18. The next step is to program the thing to walk
tethered to a computer and then to start adding the final control electronics
- currently planned to be a GP2X Linux game handheld - to the robot so it can
operate autonomously.

A tale of two controllers

When I ordered the mechanics of the hexapod from Lynxmotion I knew I wanted a
fairly powerful microcontroller that could handle at least 18 servos. The
ServoPod(tm) from New Micros Inc looked very interesting - it can control up
to 26 servos, has 22 GPIO lines and 2 x 8-channel 12-bit A/D converters.

\(JPG\)

ServoPod(tm)

Photo (c) New Micros Inc

Another intriguing aspect of the ServoPod was the fact that its native
language is a dialect of Forth designed to handle parallelism. It'd been a
long time since I'd written anything in Forth (not since my 8-bit home
computer days in the mid to late 1980s in fact) and I hadn't planned to start
again, but IsoMax sounded like it could be a good way to handle tasks such as
getting the robot to move while still being able to sense its environment,
without requiring a processor capable of running multiple threads or rolling
my own pseudo threaded controller code.

However, when the ServoPod arrived there was disappointment in store.
Lynxmotion had recommended their SSC-32 controller to me but I wanted the I/O
capabilities of the ServoPod and so I went against their advice. Firstly, the
documentation was very incomplete and at least half the sections didn't
appear to have any content. Instead you seemed to be expected to use the
forums to work out how to use the thing for any real tasks. When I came to
revisit the project after over a year's hiatus the documentation had improved
but the information revealed only served to deepen my dissatisfaction. For
example:

- Why, for a device called "ServoPod", was it necessary to run in "slow"
mode in order to control servos reliably? One reason for buying the
controller was to have a reasonably powerful processor, so why throw that
advantage away in order to control servos? Surely, given the name of the
controller, it would have been optimised for controlling servos without
having to run at half speed? To be fair there were hints outside the
documentation that it might be possible to control servos at full speed, but
I didn't see a definitive answer.
- Why was there not a decent tutorial on actually connecting and controlling
a servo in the manual, without having to trawl through the forums?
- There were comments (in the forums I think) about having to attach separate
power supplies to actually power the servos, but again there was a lack of
documentation.

I'm sure I could have progressed further but I didn't have a lot of spare
time and I just wasn't feeling in the mood to mess about with something that
had already disappointed me. I decided instead to try the SSC-32.

\(JPG\)

SSC-32 Servo controller

Photo (c) Lynxmotion

The SSC-32 is a dedicated servo controller and so it doesn't have the I/O
options of the ServoPod (it only has 4 inputs), but it does have some nice
features:
- ability to control up to 32 servos
- each bank of 16 servos can either share a single power supply or have one
each, in addition to a separate supply for the logic (this is also true of
the ServoPod)
- dead easy to control via a serial port
- ability to handle group moves (see the examples below)
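
To give a flavour of the protocol (paraphrased from the SSC-32 manual, so
check the real documentation before relying on the details): commands are
plain ASCII terminated by a carriage return, and several moves can be chained
into a single command so that they complete together:

```
#0 P1500 T1000          move servo 0 to a 1500us pulse width over 1 second
#0 P1600 #1 P750 T2500  group move: both servos arrive together after 2.5s
```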

If possible I'd rather have a single powerful processor running most of the
code, so having a relatively dumb controller that just controls the servos is
not such a bad prospect. I still need to sort out something to handle input,
but I might end up using the ServoPod for that. I've also just got my hands
on an Arduino so that's a possible candidate, but I'll likely use that for
experimenting with other stuff.

An introduction to OSGI

I gave a talk about OSGi (the dynamic module system for Java) to the London Java Community on 28th October.

I had hoped to cover some more advanced stuff, such as exactly what happens when modules are replaced in a running application while code is trying to make calls into the module being replaced, but I ran out of time preparing the presentation and wasn't able to put together code for this.

You can get the presentation here (PDF format).

Here are some useful links (taken from the last page of the presentation);
the JavaWorld articles are a good introduction and most of the code examples
in the presentation are from there.
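
If you've not seen OSGi before, the smallest useful example is probably a
bundle activator that registers a service while its bundle is active. This
sketch uses a made-up Greeter service purely for illustration:

```
package org.example.osgidemo; // hypothetical package

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

// The framework calls start()/stop() as the bundle's lifecycle changes.
public class Activator implements BundleActivator {

    private ServiceRegistration registration;

    public void start(BundleContext context) throws Exception {
        // Publish an implementation under the service interface name.
        registration = context.registerService(
                Greeter.class.getName(),
                new Greeter() {
                    public String greet(String name) {
                        return "Hello " + name;
                    }
                },
                null);
    }

    public void stop(BundleContext context) throws Exception {
        registration.unregister();
    }
}

interface Greeter {
    String greet(String name);
}
```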

Lifehouse method

I worked with Pete Townshend and Lawrence Ball to create lifehouse-method.com, a realisation of Pete Townshend's Lifehouse concept and also of "the method" as described in Pete's "The Boy Who Heard Music". Pete was responsible for the vision, Lawrence provided the music direction and I did almost everything else (managing the project and 99% of the programming). Fleur Richards from Net Design did the graphic design and a couple of developers from Net Design produced the Flash applets for sound selection and "clicking a rhythm." Finally, Javier Sepúlveda produced the Java applet used to record audio from within a web user interface.

Since lifehouse-method.com was shut down earlier this year and the Wikipedia article doesn't have that much detail, I thought it was time to write a bit more about the project. Due to contractual constraints I can't say much about how the system worked but I can at least show what it was like to use.

The home page started out with a Flash applet that let you play snippets of Who tracks but it was never really related to the method itself and it got replaced by a news page after a few months.

lifehouse-method.com home

After logging in (registration was free) you were presented with a page listing music you’d already composed (if any) and allowing you to “sit” for more - Pete likened the composition process to sitting for a painted portrait and so users of the site were referred to as “sitters.”

Sitter home

The first step was just a page giving some information about the portrait process.

Portrait intro

The next page checked that the browser supported Javascript and Java. Originally we also tested for the QuickTime plugin but this got dropped after a while since the use of Flash made it unnecessary.

I wish now that we’d handled this in a different way - checking for Javascript as soon as people entered the site and minimising the process that people had to go through every time they “sat” for a “portrait.”

Browser test page

The portrait process involved giving the system a sample of a voice, an image, a sound and a rhythm. I was instructed to make the system as much like "the method" in "The Boy Who Heard Music" as possible, which explains the rather odd input to the system. Since we did not want to limit the use of the system to musicians, we did not use the sound, voice or rhythm directly in the generated music but used digital signal processing to extract information about the input, and that information was used to create the music.

The first real stage of the portrait process was to record a sample of your voice - you could skip this step if you didn’t have a microphone. Originally, we planned to provide a set of male and female sampled voices so people without the ability to record audio could choose the voice that they liked best - however, Pete hated the idea after listening to some demo recordings and so we just made the system capable of working without a voice sample.

Record voice

The next step was to upload an image. I’d wanted to provide an easy way to grab images from the sitter’s flickr photostream if they had a flickr account but never found the time to do this.

Upload image

In case not everyone had an image to upload, we also allowed people to select 1-3 images from 20 randomly selected images out of a total of 100 that Ghene Snowdon created.

Select image

After uploading or selecting an image it was time to record a sound - this worked as for recording the voice but this time we provided alternatives for people who weren’t able to record a sound or upload a sound file.

Record sound

We tried to make selecting a sound as fun as possible - we presented them as a 10x5 grid (a Flash applet). Moving the mouse over a sound would cause it to play in a loop. Clicking on a sound would select it. The last three sounds selected were highlighted in red and these were used as the sound input to the composition.

Select sound 1

The sound grid with some sounds selected.

Select sound 2

The final step was to record a rhythm - this could be done by recording the sitter clapping or banging something using the Java applet, by uploading a sound file, or by "clicking a rhythm" using the mouse.

Record rhythm

This screen showed a Flash applet that would record the relative time between mouse clicks allowing the user to create a rhythm by clicking the mouse and play it back before finally deciding to save it.

Tap rhythm

At this point the system had everything it needed and the sitter got to see this page while the system was doing its stuff.

Composing

Finally, the music was ready and the sitter could listen to it by clicking on the big red play button. Rather than producing MIDI files and leaving the playback quality dependent on whatever sampled instruments were present on the sitter's computer's sound card, we went to quite a lot of effort to ensure good quality playback. Steve Hills created some new instruments in SoundFont2 format and these were used to generate MP3 files. The software was also capable of panning instruments to separate them in stereo "space" and choosing a volume level for each instrument in an attempt to make the end result as good as possible. A later, experimental, version of the system also used compression.

Listen

lifehouse-method.com was officially launched at Pete Townshend’s Oceanic studios on 25th April 2007. John Pidgeon “sat” for his musical portrait in front of about 20 journalists. After listening to the portrait composed for John Pidgeon the audience got to listen to a remix of another John Pidgeon piece by Myles Clarke.

Pete Townshend introducing the event and giving some of the history behind Lifehouse

Pete Townshend (photo by G. Snowdon)

John Pidgeon listening to the music the system has composed for him.

John Pidgeon (photo by G. Snowdon)

Lawrence, Pete and John taking questions from the audience

Lawrence, Pete and John take questions (photo by G. Snowdon)

Some of the audience

Audience (photo by G. Snowdon)

Here are a few examples of music that the system composed for me (the
embedded players required Flash to be enabled in your web browser; click on
the triangles to play):

Tune #1
Tune #2
Tune #3

You can also find some more music produced by lifehouse-method.com at the
Lifehouse group on vox.com.

The lifehouse-method.com servers were shut down in June 2008 and the only thing remaining at that URL is a page saying the site is no longer operational.

At the time we produced lifehouse-method.com the portrait process seemed OK and we did not have time to do anything better. I'd always hoped to go back once the site was launched and re-do the interface to make it more streamlined. For what it's worth here's a list of some of the things I was planning to implement that never saw the light of day because the site was shut down before I finished them:

  • the ability for people to allow others to listen to their music
  • a flickr-like way for people to comment on music
  • a way to see & hear the inputs used to create a piece of music and to see a representation of how lifehouse-method.com saw those same inputs (i.e. a visualisation of the data extracted from them).

Update: 27/11/2011: It was Steve Hills, not Myles Clarke, who created the SoundFont2 files. Sorry Steve! Myles was involved in all other aspects of the project related to audio production.

Update: 5/02/2012: Lawrence Ball has an album, Method Music, released by
Navona Records. Visit www.navonarecords.com/methodmusic for the album's
mini-site with liner notes, extra media, and more. The album is also
available on Amazon UK and US.

Update: 28/01/2021: Thanks to Richard Evans for correcting my comment below. The interview by Carrie Pratt appears on a fan website, petetownshend.net, and NOT Pete's official website, which is thewho.com.