What Educators Should Know About Biden’s ‘Artificial Intelligence Bill of Rights’


Tracking student progress and flagging children at risk of failing. Further customizing lessons to meet individual students' needs. Tailoring professional development to individual educators. Automatically grading essays.

These are just some of the tasks in K-12 education that experts say can be, or already are being, carried out with the help of artificial intelligence. AI is already transforming retail, agriculture, pharmaceuticals, and other industries. Its impact on K-12 education, along with almost everything else in the economy, is only expected to grow.

With this in mind, the White House released a blueprint for an AI Bill of Rights earlier this month. Here are some important facts that educators should know about it.

The AI Bill of Rights is built around five principles

You should be protected from unsafe or ineffective systems. This means, among other things, that AI systems need to be tested before they are rolled out and then carefully monitored to make sure they work as intended.

No one should face discrimination by algorithms, and systems should be used and designed in an equitable way. AI systems reflect the biases of the people who program them. That is why, for example, an algorithm designed to decide who gets a loan may unintentionally harm Black borrowers. Having people from diverse backgrounds design AI-powered systems is one possible solution.

You should be protected from abusive data practices and have agency over how data about you is used. AI systems rely on data, and student data privacy is clearly a major issue for any AI-powered technology.

You should know when an automated system is being used and understand how and why it contributes to outcomes that affect you. K-12 schools can play a big role here, too, in helping students understand technology and how it shapes the world around them.

You should be able to opt out, where appropriate, and have access to a person who can quickly consider and fix the problems you're having. That would seem to mean that companies creating educational software powered by artificial intelligence must respond quickly to any problems raised by teachers or parents.

The AI guidelines have no real legal authority

The Bill of Rights is simply a set of guidelines for areas of the economy that rely on artificial intelligence, which increasingly means almost every area of the economy. If anything, its principles may apply to the federal government's own use of artificial intelligence, according to an analysis in Wired. But it won't force Facebook, Netflix, or even state criminal justice systems to change the way they use artificial intelligence unless they voluntarily decide to embrace the principles.

The U.S. Department of Education is one of a number of agencies expected to act on the Bill of Rights. Specific recommendations for the use of AI in teaching and learning are expected by early 2023, and they should include guidelines for protecting the privacy of student data when AI is used.

What data privacy experts see as a problem

Amelia Vance, of Public Interest Privacy Consulting and an expert on schools and data privacy, thought the general content of the document was sound, but questioned how extensively the White House reached out to K-12 education groups, given some of the examples used in the guiding principles.

For example, in explaining data privacy, the document says that, ideally, data should be most accessible to those who work directly with the people the data describes. It offers the example of a teacher having greater access to their students' data than a superintendent would.

“There are many school districts that have decided they want the superintendent or principal to have access and be able to see across schools [how] teachers serve their students,” Vance said. “It raises some really serious questions again about who they talked to” in making recommendations for K-12.

Furthermore, it may not be practical for schools to always obtain parental permission before allowing students to use learning technology that relies in part on artificial intelligence. But that is how some might interpret the guidelines.

“I think it’s largely for the same reason that many superintendents and teachers struggle with parents who want to be able to individually approve what reading their child should do,” she said, referring to the push by parents in some communities to review curriculum materials before they are used with students. “That is often impractical. It is difficult for teachers to build their own curriculum. It is difficult for the school to move forward to make sure that everyone is learning the same things and that the learning is provided in an equitable way.”

What the companies that create tools for student learning think

Having guidelines can be helpful for businesses, especially those that want to reassure schools that they will protect data and eliminate bias.

“If someone wants to build an AI system, there are some great guardrails to help you build a better one,” said Patricia Scanlon, the founder and CEO of SoapBox Labs, whose natural language processing technology is designed specifically for children’s voices and is used in educational products developed by McGraw Hill and other companies.

Like other international companies, SoapBox Labs, which is based in Ireland, will have to comply with pending European AI regulations, which may be stricter. And unlike the White House’s AI Bill of Rights, those rules may come with an enforcement mechanism.

Earlier this month, SoapBox Labs became the first company to earn a product certification for prioritizing racial equity in AI design, developed by two education nonprofits, Digital Promise and the EdTech Equity Project.

Scanlon added that school districts may feel more comfortable using particular products if an outside evaluator confirms that they meet certain standards for protecting privacy and mitigating bias. “It can give some confidence, so not everyone has to be an AI expert,” she said. “I think the stakes in education are just higher than they are for your Netflix recommendation,” which may also be driven by artificial intelligence algorithms.

