Back to School Blog

Each week, the coaches from Milestone Academic Counseling offer timely advice for high school students in the Princeton area.

Michaela Guo -- Princeton High School -- 11th Grade

This summer, I learned again and again the power of inclusion.

It's relatively easy to feel comfortable in a room where virtually everyone looks like you. But once you step into a room where there's nobody who looks like you, you start to get uncomfortable and a little less confident, feeling the pressure of representing everyone who looks like you. And it's easy to exit because you feel like you went into the wrong room.

AI4ALL is a summer program that works to change that by increasing diversity in artificial intelligence. For three weeks, I lived with 31 other rising juniors from all across the country. We explored neural networks and various machine learning algorithms, applying what we learned in groups to four real-world applications: computer vision, natural language processing, the Internet of Things, and the Fragile Families Challenge (using big, complicated, messy data to predict aspects of a child's future and improve the lives of disadvantaged children). It was amazing to be surrounded by people from vastly different backgrounds who all connected through an enthusiasm to learn.

Most fascinating for me was the heavy emphasis on the ethics of AI and its social impact: we discussed accountability, privacy risks, and the consequences of a lack of inclusion. If the developers of a technology all come from one select group, then the technology will only properly serve that group, because the creators simply won't have the perspectives (and thus won't notice the needs) of a more diverse set of people. AI sits at the intersection of all fields; it's applicable everywhere from health to law. So if it learns to be biased, what happens to all the decisions we trust it to make? I used to think robots couldn't really have prejudice; they're programmed machines, after all. But they already show bias: take, for example, an AI "beauty contest" judge that learned from its training data to prefer lighter-skinned models. And with automation and globalization advancing so rapidly, when we don't give everyone the access and opportunity to understand, use, and develop the technology that shapes our world, we exacerbate economic and social disparities. The machines of our future will reflect the biases of their creators; we have to think not just about who technology is working for, but about who it can and will work against.
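The idea that a model simply inherits whatever its training labels contain can be made concrete with a tiny sketch. This is a toy illustration with entirely made-up data (nothing from the program or from the beauty-contest system): two groups of candidates with identical features, but historical "accepted" labels that favor one group. Even the simplest possible model, a per-group average, reproduces the skew.

```python
# Toy sketch of inherited bias: hypothetical data, invented for illustration.
# Both groups have the exact same feature value; only the past labels differ.
from collections import defaultdict

# (group, feature, past "accepted?" label)
training_data = [
    ("A", 1.0, 1), ("A", 1.0, 1), ("A", 1.0, 1), ("A", 1.0, 0),
    ("B", 1.0, 0), ("B", 1.0, 0), ("B", 1.0, 1), ("B", 1.0, 0),
]

# "Train" the simplest model imaginable: the average label per group.
totals = defaultdict(lambda: [0, 0])  # group -> [accepted_count, total_count]
for group, _feature, label in training_data:
    totals[group][0] += label
    totals[group][1] += 1

def predict(group):
    accepted, count = totals[group]
    return accepted / count

print(predict("A"))  # 0.75 -- favored purely because of skewed labels
print(predict("B"))  # 0.25 -- penalized despite identical features
```

The model never "decides" to discriminate; the disparity comes entirely from the labels it was handed, which is exactly why who builds and audits these systems matters.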

I learned to be more cognizant of the risks and ethical issues that always accompany the benefits of any technology. The threat of a "Terminator" takeover is not what we should be most concerned about when it comes to machine learning; we should be questioning the massive economic, social, and political impact technology already has. Purposefully including those who often get pushed out of, or never get access to, a field like AI is vital. We shape the technology that will shape us and our future, so we need diversity of perspective and background at every step of development, something that can often only be achieved through conscious inclusion.

The program:

Moderated by Jake Cornelius.
