30 May 2008
26 May 2008
Boston University doctoral student Joan Nash, who has used American Sign Language (ASL) for most of her life, is part of a team working on an interactive video project that would create a virtual sign language dictionary, allowing someone to demonstrate a sign in front of a camera and have a computer program interpret and explain its meaning. The researchers are working with a three-year, $900,000 grant from the National Science Foundation, and are currently in the early stages of the project, which involves capturing thousands of ASL signs on video. As Nash goes through the hundreds of words in English, Elizabeth Cassidy, a native ASL speaker, signs them in front of four different cameras, three in front of her and one to her right. Two of the cameras in front of her capture close-ups from different angles and one is a wider shot. The goal is to develop a database of more than 3,000 signs, with the meaning of each sign being determined by the shape of the hands, the movement of the hands and arms, and even facial expressions. Eventually, the researchers hope the technology will be used to develop a multimedia ASL dictionary to help hearing parents better communicate with deaf children and to help sign language students.
I've often wondered whether groups of people who sign, say from a particular community or geographic location, have an "accent" when they sign. Can a person signing have a "twang" or a "drawl" or something akin to a Boston accent? For instance, German telegraph operators during WWII were sometimes known to have a distinctive "hand," or touch, to their transmissions, one that the codebreakers intercepting them at Bletchley Park could identify as belonging to a particular operator. Does the same stylization or accent occur in signing?
11 May 2008
This is pretty cool. All I had growing up was the Cub Scouts' annual Pinewood Derby.
From the US site:
What is F1 in Schools?
It's a competition for teams of three to six school children to design and manufacture miniature CO2-powered racing cars and then race them at regional, national, and international levels. Sounds simple? Not when you consider that these 11- to 18-year-old kids use state-of-the-art software programs that enable them to play around with CAD (computer-aided design) and CFD (computational fluid dynamics), just like real F1 designers.
Or that they have to manage the whole project from scratch, from drawing up a business plan and raising the sponsorship, to financing it through the design and manufacturing stages, to a presentation in front of a panel of preeminent judges.
10 May 2008
I spent a few hours at the weekend viewing/listening to a series of presentations to accompany the launch of the Information Security Awareness Forum (ISAF) in London. I won't bore you with all the details right now but one item in particular caught my eye/ear. One of the presenters essentially said that security awareness doesn't work, a somewhat curious point to make in support of a security awareness initiative. Anyway, it's not the first time I've heard the argument and I've been mulling it over ever since. My blood having dropped just below boiling point, it's time to respond.
Today I took one of those "online security awareness" things, and came away with a whole case study on How NOT To Do security awareness. I shan't name the organization concerned because my aim is not to embarrass them in any way, and it really doesn't matter - I'm sure these lessons are equally valid for many other security awareness programs. . . .
(I cut all of the meat out for the sake of space but it's all pithy observation in support of the title of the post)
. . .
I cannot understand why security awareness seems to be stuck in the mold of once-a-year inform-and-test (I used to call it the "sheep dip" approach to awareness, but subsequently found out that sheep are dipped more often than most employees are made to jump through the awareness hoops!). It's high time for a new approach and some fresh ideas.
OK OK I'm ranting I know, but the reason is to point out that:
(a) with little investment and even less thought, security awareness can be done really badly;
(b) bad security awareness is unlikely to be effective, and in fact could be counterproductive;
(c) the ineffectiveness of badly designed, constructed and delivered awareness programs says nothing about the potential for well designed, well constructed and effectively delivered programs; and
(d) it really doesn't take a genius to figure out how to improve security awareness, especially when starting from such a low base. A 20-minute team seminar about information security would have achieved so much more than this hour or two of extreme tedium. Almost ANYTHING else would have been better!
OK, that doesn't sound unreasonable, right? He certainly seems in favor of "proper" education and a continuous cycle of verification, yes? Why would I be commenting at all? Well, I'm commenting because years ago, Marcus Ranum noted in a rather pithy commentary titled "The Six Dumbest Ideas in Computer Security" that (#5 - Educating Users):
"Penetrate and Patch" can be applied to human beings, as well as software, in the form of user education. On the surface of things, the idea of "Educating Users" seems less than dumb: education is always good. On the other hand, like "Penetrate and Patch" if it was going to work, it would have worked by now. There have been numerous interesting studies that indicate that a significant percentage of users will trade their password for a candy bar, and the Anna Kournikova worm showed us that nearly 1/2 of humanity will click on anything purporting to contain nude pictures of semi-famous females. If "Educating Users" is the strategy you plan to embark upon, you should expect to have to "patch" your users every week. That's dumb.
The real question to ask is not "can we educate our users to be better at security?" it is "why do we need to educate our users at all?" In a sense, this is another special case of "Default Permit" - why are users getting executable attachments at all? Why are users expecting to get E-mails from banks where they don't have accounts? Most of the problems that are addressable through user education are self-correcting over time. As a younger generation of workers moves into the workforce, they will come pre-installed with a healthy skepticism about phishing and social engineering.
Dealing with things like attachments and phishing is another case of "Default Permit" - our favorite dumb idea. After all, if you're letting all of your users get attachments in their E-mail you're "Default Permit"ing anything that gets sent to them. A better idea might be to simply quarantine all attachments as they come into the enterprise, delete all the executables outright, and store the few file types you decide are acceptable on a staging server where users can log in with an SSL-enabled browser (requiring a password will quash a lot of worm propagation mechanisms right away) and pull them down. There are freeware tools like MIMEDefang that can be easily harnessed to strip attachments from incoming E-mails, write them to a per-user directory, and replace the attachment in the E-mail message with a URL to the stripped attachment. Why educate your users how to cope with a problem if you can just drive a stake through the problem's heart?
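Ranum names MIMEDefang, a real milter-based mail filter, as one tool for this. Just to make the quarantine idea concrete, here is a minimal sketch of the same pattern in plain Python: strip attachments, store allow-listed file types in a per-user staging directory, drop everything else, and note each action in the body. The allow-list, staging directory layout, and `STAGING_URL` are all hypothetical placeholders, and a production filter would also need to sanitize filenames and sit in the actual mail path.

```python
import os
from email import policy
from email.message import EmailMessage
from email.parser import BytesParser

ALLOWED_TYPES = {".pdf", ".txt"}           # hypothetical allow-list
STAGING_URL = "https://files.example.com"  # hypothetical staging server

def quarantine_attachments(raw_message: bytes, user: str, staging_dir: str) -> EmailMessage:
    """Strip every attachment from a message: store allowed file types in a
    per-user staging directory, drop the rest, and note each action in the
    body so the recipient knows where (or whether) to fetch the file."""
    msg = BytesParser(policy=policy.default).parsebytes(raw_message)
    notes = []
    for part in msg.iter_attachments():
        name = part.get_filename() or "unnamed"
        ext = os.path.splitext(name)[1].lower()
        if ext in ALLOWED_TYPES:
            user_dir = os.path.join(staging_dir, user)
            os.makedirs(user_dir, exist_ok=True)
            with open(os.path.join(user_dir, name), "wb") as f:
                f.write(part.get_payload(decode=True) or b"")
            notes.append(f"[attachment stored: {STAGING_URL}/{user}/{name}]")
        else:
            notes.append(f"[attachment deleted (disallowed type): {name}]")
    # Rebuild the message as plain text: original body plus the action notes.
    body = msg.get_body(preferencelist=("plain",))
    text = body.get_content() if body else ""
    clean = EmailMessage()
    for header in ("From", "To", "Subject", "Date"):
        if msg[header]:
            clean[header] = str(msg[header])
    clean.set_content(text + "\n" + "\n".join(notes))
    return clean
```

Note that filtering here is by file extension only for brevity; MIMEDefang-style deployments typically also check declared MIME types and actual content, since "Default Permit" by extension is exactly the trap Ranum warns about.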
When I was CEO of a small computer security start-up we didn't have a Windows system administrator. All of the employees who wanted to run Windows had to know how to install it and manage it themselves, or they didn't get hired in the first place. My prediction is that in 10 years users that need education will be out of the high-tech workforce entirely, or will be self-training at home in order to stay competitive in the job market. My guess is that this will extend to knowing not to open weird attachments from strangers.
Heh. So there it is. In an earlier post, I commented on my previous role as an internal InfoSec consultant to a higher education institution. The way I tried to bridge the gap of parochial or specialized knowledge was this:
A few years ago, I led a team of network security staff at a private New England university. One thing I stressed was collaboration with peer groups, visibility to higher decision-makers and a decidedly NON-jackboot-thug approach toward requests and assistance; we were to be in the business of analyzing needs (perceived and actual) and distilling them into appropriate security controls that could best support them. I doubt I was successful in this approach, as my group largely functioned without mandate, but I still to this day try to keep in mind a message I pushed to my staff and to the groups I met with:
I may not know much about medical imaging or financial aid records or your particular area of expertise in computer science, biology, music, etc. What I do know a bit about is data protection and security. We meet and there is a disconnect between us. What is important to you as a researcher or faculty member? What is important to me as a staff member charged with protecting you and your data?
DNA sequencing, firewalls, intellectual property...all of this reduces to knowing and working with your constituents, addressing their needs, listening to their concerns, and presenting a common, organizationally based (read: consistent) approach to risk management and data protection, so that the groups you ultimately serve can operate consistently while (hopefully) taking a risk-based approach to assessment, mitigation and remediation.
Is there any middle ground on the topic of user education with regard to information security concerns? Is it as black and white as Hinson and Ranum argue, or is there some moderate middle ground that could work?