Recent thoughts

Note:

At present, I write here infrequently. You can find my current, regular blogging over at The Deliberate Owl.

I used to write papers in a very linear fashion.

I remember struggling to compose a decent introductory paragraph for a paper in my intro cog sci class freshman year. I despaired over my first sentence. Transitions between themes, arguments, and discussions of evidence caused me agony. Even if I had all my research lined up, I couldn't write a later part because I hadn't written the part before it yet! A paper was a series of logical steps: how could I possibly know how best to start a paragraph without knowing the sentence prior?

_laptop, piles of printed papers, a robot programming text, a highlighter, a flash drive and a pen_

My style has changed dramatically. Now, I write bits and pieces. If I know I'll be including a paragraph summarizing the work of a particular researcher, great, I can write that and have it ready when I need it. I construct bare-bones outlines, filling in details where I think they'll fit, making notes to myself about what needs fleshing out and which sections are ready to go. Text gets moved around. Cut-paste. If I don't know how I'm supporting a particular argument yet, I can move on to what I do know and come back to the troublesome bit later. What I write doesn't have to be perfect the first time through.

I don't know whether my new method saves me time. But I certainly feel more productive: I'm typing, even if I revise previously written paragraphs more frequently. Not expecting my writing to be perfect at the outset means I get more written down, which gives me more material to work with in my later revisions. I'm not staring at the screen, hesitantly trying out possible phrases, becoming the best friend of my "delete" key.

The blinking cursor at the top of a blank page is no longer my perpetual nemesis.



Genes: predictor of academic ability?

I found an article today about British researchers who are analyzing the DNA of 4000+ schoolchildren with the goal of finding a relationship between the kids' genes and their academic abilities.

_dna strand (credit: ynse on flickr)_

The reason I bring this up is not because the researchers found a gene to explain why you failed your math test, but because the article falls heavily into the "nature vs. nurture" trap. For those of you unfamiliar, nature vs. nurture is the debate over the relative importance of innate qualities built-in from the chromosomes ("nature"), versus personal experiences, environment, and upbringing ("nurture") in determining individual physical and behavioral differences. Really, it shouldn't be a debate: organisms' traits are a result of the interaction of what they start with and where they grow up: nature and nurture. The context in which any organism develops is remarkably important in determining which genes are expressed and how they interact to produce behavior.

Back to the article. There's one paragraph in particular that gets me:

"Research into height, for example, has picked out 300 genes that affect how tall people will grow, but even these genes can only explain 15% of the total variations in human height. It implies that hundreds more genes must also play a part."

No, that's a false choice. What's implied is that there could be other genes involved, but - and here's a novel thought - maybe the environment (e.g., nutrition) plays a role? A little bitsy part? Maybe?

A little googling:

In hopes that it was just the reporters who were being deterministic, not the researchers themselves, I set out to find more information.

Robert Plomin of King's College, London, is the behavioral geneticist cited in the article. He's currently conducting a huge study of British twins. I've found several articles stating that he's a "pioneer in bringing nature and nurture together," and instead of calling it a "nature vs. nurture" debate, he's said to call it (much more appropriately) "nature and nurture." That's reassuring. I'd have to read a few of his papers to be certain, but my interim conclusion is that it's just the reporters.

If you're interested, I also recently came across a popular article on the gender myth and genetic differences in men and women. It happens to cite Robert Plomin, too.



So, the Grace Hopper Celebration of Women in Computing is in about a week.

_tan-brown bookbag with Grace Hopper Celebration '08 logo on the front pocket, laying on a green carpet_

I first attended the conference two years ago, the year the free bookbags were tan-brown canvas, bright poster graphic plastered over the front, black adjustable strap, not enough pockets.

The appearance of the bag is important.

You see, last year, I missed the conference because I was studying abroad in Australia. On a friend's recommendation, my GHC bookbag was the bag I'd taken with me down under for carting notes and texts across campus. My friend said, "Maybe it'll work as a conversation starter!"

The scene: Day Two of international student orientation. The crowd of 18 to 24-year-olds, hailing from every country you can name and probably a few you can't, was in mass exodus from a lecture hall to a large space in the Wentworth building, intrigued by the notion of morning tea. As I was walking across the footbridge to Wentworth, a young woman came up to me.

"Were you in Colorado for the conference this past year?" She pointed to my bag. I couldn't help but grin, of course: there I was, halfway around the globe, and I had a pre-made connection to someone! It turned out we were in the same Number Theory & Cryptography course, too; it was thanks to our six-hour-a-day study marathons that I vanquished the final exam.

A conversation starter, indeed.

Needless to say, I'm looking forward to this year's conference!



_group shot of nine interns and Garry (one intern, Leo, is not pictured) in front of blimps, holding quadcopters and shiny cars_

In the summer of 2010, I interned at NASA Langley Research Center in the Langley Aerospace Research Summer Scholars Program.

My lab established an Autonomous Vehicle Lab for testing unmanned aerial vehicles, both indoors and outdoors.

Overview

I worked in the Laser Remote Sensing Branch of the Engineering Directorate under mentor Garry D. Qualls. There were nine interns besides me - here's the full list, alphabetically:

  • Brianna Conrad, Massachusetts Institute of Technology
  • Avik Dayal, University of Virginia
  • Michael Donnelly, Christopher Newport University
  • Jake Forsberg, Boise State University
  • Amanda Huff, Western Kentucky University
  • Jacqueline Kory, Vassar College
  • Leonardo Le, University of Minnesota
  • Duncan Miller, University of Michigan
  • Stephen Pace, Virginia Tech
  • Elizabeth Semelsberger, Christopher Newport University

_several quadcopters stacked up in a pile_

Our project's abstract

Autonomous Vehicle Laboratory for "Sense and Avoid" Research

As autonomous, unmanned aerial vehicles begin to operate regularly in the National Airspace System, the ability to safely test the coordination and control of multiple vehicles will be an important capability. This team has been working to establish an autonomous vehicle testing facility that will allow complex, multi-vehicle tests to be run both indoors and outdoors. Indoors, a commercial motion capture system is used to track vehicles in a 20'x20'x8' volume with sub-millimeter accuracy. This tracking information is transmitted to navigation controllers, a flight management system, and real-time visual displays. All data packets sent over the network are recorded, and the system has the ability to play back any test for further analysis. Outdoors, a differential GPS system replaces the functionality of the motion capture system, allowing the same tests to be conducted as indoors, but on a much larger scale.

Presently, two quadrotor helicopters and one wheeled ground vehicle operate routinely in the volume. The navigation controllers implement Proportional-Integral-Derivative (PID) control algorithms and collision avoidance capabilities for each vehicle. Virtual, moving points in the volume are generated by the flight management system for the vehicles to track and follow. This allows the creation of specific flight paths, allowing the efficient evaluation of navigation control algorithms. Data from actual vehicles, virtual vehicles, and vehicles that are part of hardware-in-the-loop simulations are merged into a common simulation environment using FlightGear, an open source flight simulator. Evaluating the reactions of both air and ground vehicles in a simulated environment reduces time and cost, while allowing the user to log, replay and explore critical events with greater precision. This testing facility will allow NASA researchers and aerospace contractors to address sense and avoid problems associated with autonomous multi-vehicle flight control in a safe and flexible manner.
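For readers unfamiliar with PID control, here's a minimal sketch of the idea behind the navigation controllers the abstract mentions. This is generic, illustrative code, not the lab's actual software; the gain values and timestep are placeholders, not real tuning:

```python
# Minimal 1-D PID controller sketch. The controller drives a measured
# value (e.g., altitude) toward a setpoint by combining the current
# error (P), its accumulated history (I), and its rate of change (D).

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: command altitude toward a 1.0 m setpoint (placeholder gains)
pid = PID(kp=0.8, ki=0.1, kd=0.2)
command = pid.update(setpoint=1.0, measurement=0.4, dt=0.02)
```

In the real system, a loop like this would run once per motion-capture update for each controlled axis of each vehicle.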

Articles and other media

In the media

On my blog

Videos

Most of the summer was spent developing all the pieces of software and hardware needed to get our autonomous vehicle facility up and running, but by the end, we were flying quadcopters! (Captions are below their corresponding videos.)

Credit for these videos goes to one of my labmates, Jake Forsberg.

Object tracking for human interaction with autonomous quadcopter

Object tracking for human interaction with autonomous quadcopter: Here, the flying quadcopter is changing its yaw and altitude to match the other object in the flight volume (at first, another copter's protective foam frame; later, the entertaining hat we constructed). The cameras you see in the background track the little retro-reflective markers that we place on objects we want to track -- this kind of motion capture system is often used to acquire human movement for animation in movies and video games. In the camera software, groups of markers can be selected as representing an object so that the object is recognized any time that specific arrangement of markers is seen. Our control software uses the position and orientation data from the camera software and sends commands to the copter via Wi-Fi.

Autonomous sense and avoid with AR.Drone quadcopter

Autonomous sense and avoid with AR.Drone quadcopter: The flying copter is attempting to maintain a certain position in the flight volume. When another tracked object gets too close, the copter avoids. We improved our algorithm between the first and second halves of this video. Presently, only objects tracked by the cameras are avoided, since we have yet to put local sensors on the copters (the obstacle avoidance is done using global information from the camera system about all the objects' locations).
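A toy version of that avoid behavior can be sketched in a few lines. This is my illustrative reconstruction of the general idea, not our actual algorithm; the safety radius and 2-D geometry are assumptions for the example:

```python
import math

SAFE_RADIUS = 1.5  # meters -- placeholder value

def avoid(hold, obstacle, safe_radius=SAFE_RADIUS):
    """If a tracked obstacle intrudes within safe_radius of the copter's
    hold position, push the hold position directly away from the
    obstacle until the separation equals safe_radius (2-D sketch)."""
    dx, dy = hold[0] - obstacle[0], hold[1] - obstacle[1]
    dist = math.hypot(dx, dy)
    if dist >= safe_radius:
        return hold  # obstacle is far enough away; keep position
    if dist == 0:
        dx, dy, dist = 1.0, 0.0, 1.0  # pick an arbitrary escape direction
    scale = safe_radius / dist
    return (obstacle[0] + dx * scale, obstacle[1] + dy * scale)
```

With global tracking data, a check like this runs against every other object the cameras see; swapping in onboard sensors would change where the positions come from, not the geometry.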

Autonomous quadcopter tracking and following a ground vehicle

Autonomous quadcopter tracking and following a ground vehicle: The flying copter is attempting to maintain a position above the truck. The truck was driven manually by one of my labmates, though eventually, it'll be autonomous, too.

Virtual flight boundaries with the AR.Drone and the Vicon motion capture system

Virtual flight boundaries with the AR.Drone and the Vicon motion capture system: As a safety precaution, we implemented virtual boundaries in our flight volume. Even if the copter is commanded to fly to a point beyond one of the virtual walls, it won't fly past the walls.
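The simplest way to picture virtual boundaries is as a clamp on commanded positions. This sketch is illustrative only (it assumes a volume matching the 20'x20'x8' dimensions from the abstract, in feet; the names are mine, not from our code):

```python
# Clamp any commanded target into the allowed flight volume before
# sending it to the copter, so commands beyond a virtual wall stop
# at the wall.

BOUNDS = {"x": (0.0, 20.0), "y": (0.0, 20.0), "z": (0.0, 8.0)}

def clamp_target(x, y, z, bounds=BOUNDS):
    """Return the nearest point inside the flight volume."""
    cx = min(max(x, bounds["x"][0]), bounds["x"][1])
    cy = min(max(y, bounds["y"][0]), bounds["y"][1])
    cz = min(max(z, bounds["z"][0]), bounds["z"][1])
    return cx, cy, cz

# A command beyond the wall gets pulled back to the wall:
clamp_target(25.0, 10.0, 9.5)  # -> (20.0, 10.0, 8.0)
```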

Hardware-in-the-loop simulation

Hardware-in-the-loop simulation: Some of my labmates built a hardware-in-the-loop simulation with a truck, and also with a plane. Essentially, a simulated environment emulates sensor and state data for the real vehicle, which responds as if it were in the simulated world.



_My labmates, our mentor, our vehicles, and I_

On the last day of my LARSS internship, NASA EDGE filmed my lab for their Future of Aeronautics episode! It's currently up on NASA's main page in the "Podcasts and Vodcasts" section, and it's available both online and through iTunes. The opening montage has clips of my labmates and me, and the segment about our work starts at 19:18 and lasts three minutes.

I encourage you to take a look!

