We had a pretty bad storm over the weekend. For our part, we lost some parts of trees but not much in the way of property damage. We didn't lose power. I have friends whose power is out with no relief in sight.
This is the garden. Just to give you a flavor of what the snow looked like. If you look in the background you can see the body of a swamp maple. More on that in a moment.
This is the persimmon tree next to the garden. It was twice as tall as it is now. Very nice persimmons.
This is a chestnut tree we planted several years ago. It, too, is about half its former size.
Finally, here is the swamp maple I referred to, split in half down the middle. This picture doesn't do it justice since it shows only 20 feet of the main trunk. The tree is about 60 feet tall, so this is only about a third of what's on the ground.
We're glad we made it through but many people fared worse than we did. One friend of mine had a tree fall on the line that fed his house. It ripped off the circuit box. The city will string line to the box but won't fix the box, so he's stuck until he can get somebody to repair it.
My friends in Hartford described their neighborhood as a "war zone." No power. Houses broken. Cars smashed. Power lines down. Street cordoned off.
And there are still those who would have us dissolve FEMA instead of making it work.
For anybody out there with half a brain, the first human-scale manifestation of global warming isn't temperature; it's an increase in chaotic weather. Massachusetts had its first tornado in generations and this particular storm is unprecedented. Add them to the 500-year floods we've been seeing every few years.
And we huddle right on the tracks in front of this oncoming train, fingers in our ears and eyes tightly closed, chanting to ourselves, "There is no train. There is no train. There is no train."
Sunday, October 30, 2011
(Picture from here.)
I am obsessed with Hatsune Miku.
A quick history.
Back in 2000, Yamaha started developing Vocaloid technology-- a synthesizer aimed at recreating a singing voice. By the mid-2000s the technology had been encapsulated in two "virtual soul vocalists", Leon and Lola. What's interesting here is that almost immediately upon creating a singing synthesizer, it was personalized into a fictional entity. Crypton Future Media created their first vocaloid, Meiko, in 2004. Hatsune Miku was released as the first of a "character vocal series" in 2007. Her name roughly translates to "first sound of the future." There is imagery for her: big anime eyes, blue hair that's about four feet long. Her voice was synthesized from samples of the voice of Saki Fujita. She's targeted to be a young teen and her voice was modified to fit that role.
Okay. We know how fictional entities can take Japan (and the US) by storm. Think Pokemon. Think Yugioh. But there aren't very many "live" concerts for Pokemon. There have been for Miku.
Starting in 2009 Miku performed on stage with a live band by projecting an animated image on a scrim.
This is only the start of what's interesting. Think of the technology. Was Miku rendered, recorded and projected and the band kept up like in silent movies? Was she pre-rendered and then the performance locked into the band's work? Was there real time rendering involved? I have no idea.
Think of the music. Enjoyable pop, by and large. There have literally been thousands of songs written for Miku since the vocaloid technology is readily available. How were the songs chosen? Is this crowdsourcing of vocal material? Have we reached the age where pop songs can literally grow from the grass roots up?
Think of the celebrity phenomenon. Miku, and her three "costars", play to live audiences sold out months in advance. People pay top dollar for a concert given by a fictional creation.
But for our purposes today I'm just going to talk about the interaction between Miku and her audience.
Humans are tokenized communicators. We don't exchange real things; we exchange symbolic tokens. Even real objects are imbued with layers and layers of symbolism. If I'm in the garden working with two trowels and you are next to me digging with your fingers, I might give you the second trowel. That simple act has a wealth of information associated with it. By doing so I've acknowledged you have a lack. I've recognized your need and moved to fill it. I've stated a willingness to share. And you, as the recipient, instantly grasp this information. All by the movement of the token trowel.
Of course, humans build tokens out of other tokens, have multilayered levels of communication, deceive, obfuscate and otherwise mess with communication. I could make a strong argument that everything we do, everything that makes us human, has something to do with this tokenized communication with each other. It's what we do best. I'd hazard a guess that this ability has been selected for and is one of the reasons we've evolved this oversized brain of ours.
Terry Winograd and Fernando Flores wrote Understanding Computers and Cognition, one of the most brilliant discussions of human communication I've ever read. I haven't found much discussion of it on the net but there's a good article here that talks about it.
One of the points they made was that many of the mechanisms we take for granted are in fact tokenized communications between the developer of the mechanism and the user. If you use a microwave oven, the way the oven works (how to set the time, the power, what power settings are available, the shielding) is in fact a model of how the oven should work. A finite set of people have communicated that model to you, the user.
A modern three minute song consists of hook, melody, various bridges and chorus. It's a simple model of a complex phenomenon. However, the modern song itself is a simplification of earlier song forms that go back hundreds of years. Regardless, the song is a collection of tokens that have been grouped together into a single entity intended to convey a communication to the listener.
Music binds time: it has a beginning, a middle and an end. It is repeatable-- and not. Is the experience of hearing a song the first time the same as the second? The fiftieth? How about comparing a studio recording with a live performance? We recognize the song is the same even though we also recognize it is different. Music evokes tremendous activity all the way from time perception and auditory processing to activating memory and emotion and higher cognition. That's a lot of activity caused by a three minute sound.
Miku is a performance organizing principle on stage. The band members are fully human and they play right alongside her. They are, in fact, virtuoso musicians. My son pointed out that Miku was nothing without a good band behind her. I pointed out that's true of most pop singers.
In the most recent vocaloid concert (video here.), Miku is brought back for an encore and seems to stop and look away as if overcome with emotion. (See here, about 4:23.) The audience loves it.
So, what is Hatsune Miku?
She's not an Artificial Intelligence. Not that there isn't a lot of intelligence in her software, but she's no AI. Someone programmed that behavior into her. Someone made a communication to the audience with Miku's actions as the token.
Alan Turing, like many of the early computer scientists, was concerned with computer consciousness. How would they ever tell? Turing devised what has become known as the Turing Test. The idea was to put a teletype in a room and use it as the medium for conversing with an entity out of sight. If the entity is indistinguishable from a human being, it is, in effect, human, regardless of whether it runs in silicon or not.
I've never liked the Turing Test, for a lot of reasons. For one, it essentially says we can't define consciousness but we, as humans, know it intrinsically, so we'll use that innate knowledge to recognize it. This is weak analysis. It's analogous to those who say, "I can't define pornography but I know it when I see it." It may or may not be true but it isn't much in the way of science.
It also presupposes we can in fact do that recognition. Humans project their own qualities willy-nilly onto the world. We push human qualities onto the universe, our pets, dolphins, elephants, lizards, rocks and the sun. Why do we think we'd do any better with a teletype?
It also presupposes a particular kind of consciousness and intelligence: human. Other species might fail the test and still have intelligence and consciousness.
Finally, it is prone to failure on its own merits. If some genius writes a program that can mimic a human interaction over a teletype-- some super Watson, for example-- is it then by definition conscious? Think spam bots. Some of them are pretty damned good-- in a limited venue, to be sure. But though we're not yet in the country of such computer programs, we can see it from where we stand.
If we substitute a song for a teletype, does Miku pass the Turing Test? Let's push further into the music. Miku fronts for the band but she doesn't interact with it. She can't depart onto an unrehearsed solo and have the band twig to when she hands it off. Almost any garage band can do that, and some of the best bands, like The Who or Iron Butterfly, could make whole concerts out of handing improvisations from one member to another. A whole field of music, jazz, was in part founded on it.
So, let's say future Miku improvements do allow her to act independently within this musical Turing Test. Would her passing define her to be conscious and intelligent?
I would say consciousness is not proven but intelligence is. Like the teletype program I suggested before, within that particular and narrow venue Miku would be every bit as intelligent as a human. Consciousness implies self-awareness of the experience and I don't think that's proven by the Turing Test in any way.
But, returning to Winograd and Flores, any object we create can be (and likely is) a token of communication between people. Miku, as she now stands, is a mechanism for communicating the song to the audience. The song is a creation of a set of individuals: the members of the band, the composer, the arranger, the programmer who taught Miku the song and the animator who produced her visualization.
Since I'm obsessed with her, I'll say the communication is good.
Wednesday, October 26, 2011
I've been hearing about how great Steve Jobs was for years. Yeah. Yeah. Huge impact. Huge.
But two people died in the last few weeks and one of them had a tremendous impact on everything you touch. The other was Steve Jobs.
It was Dennis Ritchie.
Let us go back to the late Triassic of computers: the 1960s. Remember that line in Apollo 13 about how great it was that the space program created a computer that could fit in a single room? That was the time.
Computers do essentially five things: test, branch, put to memory, fetch from memory and add. Everything else computers do is a variation or extension of these five operations. Back in the day, there were essentially three languages: COBOL, FORTRAN and assembly language. COBOL was used in business. FORTRAN was used in math and science. Everything else, from hospital information systems to a whole lot of billing and tracking of population data, was done in assembly.
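To make that concrete, here's a sketch in C (the function is mine, purely illustrative, not historical code) of how something as ordinary as multiplication can be assembled from just those five primitives:

```c
#include <assert.h>

/* Multiplication built from the five primitives:
   fetch from memory, add, put to memory, test, and branch.
   The loop is the test-and-branch; everything else is fetch/add/store. */
unsigned multiply(unsigned a, unsigned b)
{
    unsigned result = 0;        /* put to memory */
    while (b != 0) {            /* test, then branch */
        result = result + a;    /* fetch, add, put to memory */
        b = b - 1;              /* even subtraction reduces to an add (of a complement) */
    }
    return result;
}
```

Every higher-level construct, from floating point to string handling, bottoms out in loops like this one.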
Assembly language is essentially a symbolic representation of machine operations. It is machine specific. For example:
MOV r1, #0x01139ba
TAM 0x01139ba
are essentially the same functional instruction. The first moves the contents of register 1 to the address. The second transfers the contents of the A register to the address.
Portability of assembly language was never an option. Not only was the syntax functionally different (I won't bother to descend into the hell of opcodes), the implementation of the functionality was different. While every computer had to execute the five functions I listed above, it's not easy to implement complex functionality in just those five ways. In addition, each machine manufacturer had its own idea of how to manage the problems. How to address memory? Use memory for the add function? It would be faster to do it in specialized electronic components called registers. (Hence the TAM instruction above.) There was more in common between a dog and a horse than there was between IBM 360 assembly and Harris assembly.
At least both the dog and the horse have the same number of legs.
What assembly language offered was no impediment between what the programmer wanted and what the fundamental machine provided. COBOL and FORTRAN put a layer of insulation between the underlying machine and the programmer. That was fine if you didn't need to look at what the machine could do but not so great if you had to access that functionality. Some FORTRAN compilers allowed the programmer to include assembly language right in the FORTRAN code.
Enter Dennis Ritchie.
The need for computer languages that operated at a higher level than assembly (and were not COBOL or FORTRAN but something more formal) had been recognized for some time. ALGOL was an early attempt.
Dennis Ritchie and Ken Thompson looked at ALGOL and its descendants (BCPL, and Thompson's own B). They liked some of it but thought the rest was cumbersome. They took the best parts, added their own preferences, and created the programming language C.
C was a portable language. It was simple. It was versatile. And it took the computer world by storm. First released in the early seventies, it gave all of the usual functionality of assembly language without the platform dependence. Before C, if you wanted to write a program on one machine and then have it run on a second machine from a different manufacturer, it was murderously painful.
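A sketch of what that buys you (the names here are mine, illustrative only): the same pointer store compiles on any machine, where an instruction like the MOV/TAM pair above each worked on exactly one.

```c
#include <stdint.h>
#include <assert.h>

/* A stand-in for a memory-mapped hardware location; on a real machine
   this could be a fixed address like the 0x01139ba in the assembly example. */
static uint8_t device_register;

/* The portable equivalent of "move this value to that address".
   The compiler picks the right MOV, TAM, or whatever the target machine uses. */
void store_to_address(volatile uint8_t *addr, uint8_t value)
{
    *addr = value;
}
```

This is why C displaced assembly so quickly: it stayed close enough to the machine to express "put this byte there" while leaving the opcode selection to the compiler.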
Even better, it was structured. You have to understand that at this point in programming there was a war being fought between structured and unstructured programmers. Structured programmers tried to write code that was readable and, hopefully, elegant. Unstructured programmers thought structure made you weak. Comments (annotations in the code that told you what the code was doing) were considered unmanly.
Think about this code:
i = 10;
monetaryDenomination = 10;
Which is more informative? The computer doesn't care. 10 is 10.
I remember working my way through five thousand lines of uncommented assembly code one night just to figure out why a carriage return was present in one report option and not in another.
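A hypothetical sketch of the difference (both functions and the constant name are mine, not from any real report code): the compiler emits the same multiply for each, but only one tells the next programmer what the 10 means.

```c
#include <assert.h>

/* Unstructured style: correct, and nearly unreadable a month later. */
int f(int a) { return a * 10; }

/* Structured style: the same computation, named and commented. */
#define CENTS_PER_DIME 10

/* Convert a count of dimes into cents. */
int dimes_to_cents(int dimes)
{
    return dimes * CENTS_PER_DIME;
}
```

To the machine they are identical; to the person debugging the carriage-return bug at 2 a.m., they are not.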
By the eighties everybody was either adopting C or getting ready to adopt it. Go online and look for FORTRAN and COBOL jobs. See any? Now look for C jobs or jobs in its derivatives.
Ritchie was also instrumental in creating Unix, from which Linux derives. Both are written in C. Using your Android phone? Got Linux. Mac? Uses a descendant of the NeXTSTEP operating system, a variant of Unix, written in C. iPhone? Its software is built on C and its descendants. Google? Lives in the Cloud. I bet it resides on Linux, written in C and C++.
Yeah, Jobs did a lot of things. But my hat's off to Ritchie. Jobs is like the guy who invented the designer car.
Ritchie invented the wheel.
Tuesday, October 25, 2011
This post was inspired by Buster Blonde's fine post here.
I try not to talk much about my own personal life here.
Essentially, I strongly feel my opinions, work and profession should depend on the work I produce, not the path I took to become who I am that produces it. If I write a story with a black man as the main character I want it to be judged on the quality of the work and not judged that I, a white man, might dare to write about a black character. Similarly, if I write an opinion about how the poor are treated here in the US of A, I want that opinion evaluated whether I am poor or rich or somewhere in between.
I also feel that anecdotal evidence is no evidence at all. Suppose we have a statistic-- say, the percentage of people born poor who manage to change their state. We'll say, arbitrarily and with no supporting fact, that 90% of the people who are born poor stay poor. (The real statistics are much more interesting. See here. But for the purposes of this discussion we'll say 90%.) Then someone comes along with an anecdote that such and such (insert name: Herman Cain, Henry Ford, etc.) raised himself out of poverty by dint of his own work ethic. That anecdote is immediately used to counter the statistic.
This is more than bogus. This is called bullshit.
Anecdotal evidence doesn't say anything about circumstances.
My life reflects who I've become but has no relevance to the world at large save that it formed my opinions.
My father was an aerospace engineer. He worked hard. Relentlessly, brutally hard. He had been raised as a farmer-- his family (read: him) plowed the farm with a horse because tractors were for the wealthy. They had no running water, electricity or indoor plumbing. He was critically unprepared for any technical education.
Then, World War II happened.
Dad wanted to be an aviator but had no math skills at all. He had a high school education from a high school that wasn't that good to begin with. He hadn't taken advantage of what had actually been there. He had been rejected by a music college because of insufficient math skills.
He received his draft notice. Dad didn't want to be fodder. He didn't want to die in the bowels of a ship. The rules were that once you received the notice you had a period of time to enlist before you were required to show up. He wanted to be an aviator and tried several times to enlist, being rejected each time for various reasons. He would solve one problem and then find another. The last one was high school geometry.
Dad was walking down the steps of the courthouse and passed another gentleman who had had the same problem and was explaining to his friend that he'd gotten around it by agreeing to take a geometry course at a local school. Dad turned around on the steps, went back inside with this information and managed to get into aviation school.
Understand, I am enormously proud of my father. He was the sort of man who actually could take a sow's ear and make a silk purse. But, like anyone else, he had to have the luck to get the sow's ear in the first place.
Dad didn't help me in college. Not because he didn't want to but because in 1971 the bottom completely fell out of aerospace and Dad was finished as an aerospace engineer. Eventually, he found his feet, but I was on my own.
Which I managed by working two jobs and grabbing every National Defense Student Loan I could find. NDSL: thank you, Lyndon Baines Johnson. Imagine paying a guaranteed 3% loan for your education. NDSL stopped: thanks a lot, Nixon.
If there had been no GI Bill, Dad wouldn't have been able to get into college. He would have had to go back to farming or selling pianos or cars or something because his high school education had not prepared him for much else. Did the country need him to do that? If there had been no NDSL, I'd have been screwed. I didn't have the grades to go to college on scholarship. I worked damned hard but so what? Effort does not always equal reward. Did the world need me pumping gas or digging ditches?
Sure I worked hard and made some good decisions but it was national policy and luck that got me here. Commodity Foods fed me that first year in college before I found my feet. And the places I had to live-- well, let's just say I know what I know about dumpster diving from personal experience.
So, does that mean I want my son to have proper instruction in dumpster diving to better prepare him for the workplace? Hell no.
Sure, I think people ought to work hard. But do I think opportunity is available to everyone without help? The statistics don't bear that out.
For my own part, anyone who thinks they did it without help is dreaming. And that's why we have to make sure people have help.
Friday, October 21, 2011
Every now and then my knee-jerk bias against Faux News gets slapped.
Sally Kohn published this amazing article on the Occupy Wall Street movement. I was led to it by James Fallows here.
Okay. Take a deep breath. How should I consider this? It's on the same site as Bill O'Reilly and Sean Hannity, both of whom are dumping on Occupy Wall Street in some weirdly desperate attempt to delegitimize it. When I see their opinions it immediately makes me wonder what's going on.
Now I'm really interested.
Some not so much fun.
Thursday, October 20, 2011
(Picture of the failed filibuster is from Ezra Klein's article here.)
The Senate used to work. Really. It actually passed bills, confirmed nominations and moved the gears of government.
Not any more.
Republican tactics of using the filibuster (See here.) have made 60 votes a requirement for passing legislation. When was the last time any party could pull that off without defectors? Answer: never.
Klein points out that the media is an enabler of this process by not calling a spade a spade. In a mistaken concept of objectivity, they blame both parties when, really, only one is doing this at the moment. You could argue that if the Democrats were the minority they'd do the same. Except they didn't. Of course, now that the Republicans have paved the way, there is no reason the Dems won't do the same next time around.
There's another point I'd also like to bring up. The media has been touted as left wing by the right for a long time. Set aside, for the moment, that there's no evidence of this. Set aside, too, that with propaganda mills such as Limbaugh, Hannity and Faux News, I suspect media without a right-wing tilt are in the minority. I argue that with this false equivalence problem, the media is actually aiding and abetting the Republicans.
Not that this surprises me. Follow the money. I don't think that the corporate control of news is explicitly tilting the news-- with the exception of Faux News. But I think that any news outlet that explicitly burns the hand that feeds it is going to face a higher bar than one that doesn't. And this false even-handedness is an easy way out.
The bicameral hypocrisy of the modern Republican filibuster approach is strange. After all, the filibuster is not enshrined in the Constitution. There, a two-thirds majority (which would be 67 votes, not 60) is required only in narrowly defined cases: impeachment, member expulsion, overriding a veto, treaties and amending the Constitution. Don't take my word for it. Go look.
This means that all of the Senatorial rules-- including the filibuster-- are, in effect, laws the Senate has placed upon itself. One could make the argument they fly in the face of the Constitution itself but nowhere in the Constitution is there any reference to the rules by which the houses of government operate.
Which have now changed.
Before the current Republican use of the rules, it took a simple majority to pass laws. The filibuster was used only in extraordinary circumstances. Now it takes a supermajority to pass anything but the most bland and timid of legislation.
Consider this analogy: the ship of state has two teams, both of which are required to man the oars and move it forward. It's hard to move forward when one team has decided to burn the ship down to the waterline if it doesn't get its way.
Fun things to read.
Lead in Kids' Drinking Glasses
Wednesday, October 19, 2011
Here are some of the stories I'm interested in.
Tuesday, October 18, 2011
Sunday, October 16, 2011
I've been talking about human evolution and science for a bit now. A few times I've tied it to the writing. But mostly I've left it alone.
I do want to talk about that for a bit today.
One of the things that should leap out at anybody who reads the same material that I do is how incredibly complex the world is. Consider, for a moment, just how many currencies there are: yen, yuan, dollar, shekel, ruble, etc. There are hundreds of languages in use and thousands that exist or have existed. How many cultures can there be? In numbers of cultures, I'd argue that we're at a low point as mass communication tends to homogenize the world. Imagine how many cultures there might have been that we will never know about.
We think of ancient Egypt as the pharaonic culture but it lasted a few hundred human generations and was an international hub. Imagine how many cultures existed there and how they might have changed over those generations. Imagine living with the knock-on effects of those unknown cultures without ever having known they existed.
The point is that any world we create, either fantasy or SF, must by definition be as big and complex as the world we live in. One of the interesting things about Lord of the Rings, for example, is how many human species there are: elf, orc, "human", hobbit, dwarf. (For those who think these are different species, consider that Aragorn was able to mate with an elf and produce fertile offspring. They're no more different from each other than domestic dogs are from wolves.)
What always comes up in my mind when I read these sorts of works is how the politics might work.
Politics is the mechanism by which differing individuals and groups manage to work out their differences, preferably without violence. It is an emergent property of a social and intelligent species. Chimps have politics. Monkeys have politics. Birds have politics. They may not be as complex as ours (excepting chimps: See Frans de Waal's Chimpanzee Politics: Power and Sex among the Apes.) but as soon as you have a group of social animals of rudimentary intelligence, politics rears its head.
Given all of that, why is there only one language in Lord of the Rings? I know that other languages exist. But all of the main characters speak a Common Tongue. That is interesting in and of itself. Where did it come from? There is a political history there and it's not mentioned anywhere in the book. In Rudyard Kipling's Kim everybody speaks the "vernacular", a common patois that derives as much from the English occupation of India as it does from a need for trade among disparate peoples. But at least in Kim there are different languages, cultures, customs. In Lord of the Rings you get the sense that everybody in it is a different class of Englishman without the richness of the English language.
I pick on LOTR because it's part of the common culture of SF and Fantasy. The classic SF is just as bad: Foundation is a bunch of Americans as is pretty much most of Heinlein's work-- not all, fortunately.
Ursula K. Le Guin wrestled with this issue of representing the natural complexity of human society in The Dispossessed, as well as in other works. In The Dispossessed, the two worlds were tightly constrained to limit complexity. On Anarres, the world had such scarce resources and the setting was so limited that little cultural diversity was shown though much was hinted. On Urras, the viewpoint was even more tightly limited. The nice thing about The Dispossessed is that she let the humans be complex individuals within complex, though limited, environments and then left tantalizing hints that the world had much, much more to offer.
We often forget how incredibly intelligent humans are. In a sea of six billion brains the abilities of an individual brain become undervalued just because they are common. In fiction we cannot forget this fact. Humans don't lose their intelligence in groups; they change motivations.
Larry Niven (to the best of my recollection) said in Protector, "intelligence is a tool that is not always used intelligently." Hence we see very bright people doing things we would not consider very bright-- though they often bring intelligence to bear on them.
This is part of the complexity of human behavior. A good fictional world is going to have smart people doing smart things, smart people doing dumb things for smart reasons, dumb people rising to become smart under singular circumstances and smart people banding together and compromising on dumb propositions just because that's all they can agree on.
Friday, October 14, 2011
Gordon's notes quotes a source here suggesting that political policy changes are independent of the votes of those who are not in the 1%. To quote Gordon quoting Ezra Klein of the Washington Post:
... Martin Gilens, a political scientist at Princeton University, has been collecting the results of nearly 2,000 survey questions reaching back to the 1980s, looking for evidence that when opinions change, so too does policy. And he found it—but only for the rich. Policy changes with majority support didn’t become law except when that majority support included voters at the top of the income distribution. When the opinions of the poor diverged from the opinions of the rich, the opinions of the poor did not appear to matter. If 90 percent of the poor supported a policy change, its chances of passage were no better than if 10 percent of the poor supported it...
I wish I could say I was surprised.
Tuesday, October 11, 2011
Friday, October 7, 2011
Wednesday, October 5, 2011
Tuesday, October 4, 2011
China released a video of the launch and animation of the orbiting Tiangong-1 to the tune of... America the Beautiful.
I prefer to believe this was not a mistake but an honest homage from the aspiring leader of space exploration to its sad older brother beginning to fall behind.
Monday, October 3, 2011
I like model rockets. There are a bunch of sites devoted to model rocketry but here are some special ones:
Air Command Rockets: professional quality water rockets. How to and getting started.
SS67B-3: Liquid Fuel Rocket Engine
LDRS: Large and Dangerous Rocket Society. Here's 2011's festival.
Tripoli Rocketry Association: One of the LDRS participants
Ball Rocket Festival, Black Rock: The big rocket festival for people who want to launch over 50k feet. Here's an example.
CMASS: for those who are local to Massachusetts
Further links can be found on my website here.
Sunday, October 2, 2011
Okay. This is a bit of a roller coaster ride.
In this corner, we have Australopithecus. Australopithecus arose about 4 million years ago and eventually became extinct about 2 million years ago. Australopithecus is considered a hominid, in that he is one of the great apes: those apes that include us. Lucy was an Australopithecus.
Australopithecus is not normally considered a hominan: a member of the human tribe and its extinct relatives. Pity these terms are so close but there it is.
In the other corner we have Homo habilis. H. habilis is of the genus Homo and lived from 2.3 to 1.4 million years ago. H. habilis has been associated with stone tools and may have had fire. He had about half the cranial capacity of a human. H. habilis is often considered the first on the direct line to humans.
This is the way we've considered it for some time. Now it's time to throw in a couple of monkey wrenches.
Enter Australopithecus sediba.
Au. sediba was discovered in 2008 and two specimens have been dated to between 1.977 and 1.980 million years ago. The dates are particularly precise because the sediment they were buried in contained uranium salts from which accurate dates could be determined. (See here.)
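For flavor, the arithmetic behind any radiometric date is the standard decay law; this is textbook uranium-series reasoning, a sketch rather than a claim about the paper's exact method:

```latex
N(t) = N_0 e^{-\lambda t}
\qquad\Longrightarrow\qquad
t = \frac{1}{\lambda}\,\ln\!\left(1 + \frac{D}{P}\right)
```

where \(P = N(t)\) is the parent isotope remaining, \(D = N_0 - N(t)\) the daughter product accumulated, and \(\lambda\) the decay constant. Measure the daughter-to-parent ratio precisely enough and a tight date falls out.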
The dates are important since it's been determined that one of the drivers of human evolution has been climatic fluctuation (see here). High variability started around 2.7 million years ago. By 1.5 million years ago the last hominan standing is Homo erectus.
Now, Au. sediba turns out to have many pre-adaptations toward being human. It has hands that are closer to ours than those of Homo habilis. The pelvis more closely resembles ours and suggests that it used a form of bipedal walking. (See here, here and here.) The ankle looks humanlike but the heel looks like that of an ape.
The pelvis is particularly interesting since Au. sediba's brain isn't particularly big. However, the pelvis is upright and resembles a human pelvis, doing damage to the idea that the human pelvis got its shape because it had to cope with birthing a big human skull.
Most importantly, Au. sediba's brain more closely resembles a modern human brain than H. habilis' does.
How do we know this? you ask.
The brain closely fits in the skull cavity. By looking carefully (see here) at the skull cavity the shape of the brain can be deduced. Which is what these scientists did using precise X-ray micro-tomography.
Humans have a brain about four times the size of a chimp's. H. habilis' brain is about half the size of ours and therefore about twice a chimp's. Au. sediba's brain is barely 40 cubic centimeters bigger than a chimp's. Not much at all.
However-- and this is a big however-- Au. sediba's brain is organized and shaped more like a human brain than either a chimp's brain or that of H. habilis. (See here.) The frontal and olfactory regions were similar to a modern human's.
To be sure, these specimens have to be studied a lot more to figure out where they are to be placed in the heritage of Homo sapiens.
But there are a lot of interesting things going on here. I've talked before about evolutionary pre-disposition. Evolution takes advantage of the way things are rather than what things could be. Consequently, the hands and ankle of Au. sediba predispose him towards us. It gives him a leg to stand on, as it were.
I would guess that the brain of Au. sediba was not as good a cognitive engine as that of H. habilis. After all, it's half the size. In brains, size matters but organization matters more. Even so, while Au. sediba had a better organized brain I doubt it was as smart as H. habilis. Think of Au. sediba as a finely tuned four cylinder engine compared to the hulking V8 of H. habilis. The little engine is built like a Swiss watch but still can't crank out the raw torque of the inefficient V8.
The difference, though, is scalability. The brain of Au. sediba had the organization to expand. Quoting the abstract here regarding this organization: "... are consistent with gradual neural reorganization of the orbitofrontal region in the transition from Australopithecus to Homo, but given the small volume of the MH1 endocast, they are not consistent with gradual brain enlargement before the transition."
That is, the Au. sediba brain organization was pre-disposed to grow better as it grew bigger.
That's one idea. Faye Flam thinks the two species might have been friends with benefits, blending the best of both species. Which does damage to the idea they were separate species to begin with.
This is going to be fun.