
Sunday, June 25, 2017

Know the Difference Between AI, Machine Learning, and Deep Learning | Edgy Labs (blog)

AI is defined by many terms that crop up everywhere and are often used interchangeably. Read through to better know the difference between AI, Machine Learning, and Deep Learning.


Photo: Ktsdesign | shutterstock.com
Photo: Zayan Guedim
"Artificial Intelligence is, locally, a computer algorithm tasked with solving input problems based on accessible data and operational parameters, with respect to the amount of computational power available to the algorithm. More generally, AI is the name given to machine intelligence." inform Zayan Guedim, Author at Edgy Labs.

Within the vast field of AI are specific concepts like machine learning and deep learning.

Like Russian Matryoshka dolls, where each smaller doll nests inside a larger one, each of the three fields is a subset of the next: deep learning sits inside machine learning, which sits inside AI. Advances in these three technologies are already revolutionizing many aspects of modern life, and although closely related, they are not the same.

In this post, we’ll begin with the biggest doll “AI” and work our way down to the smallest.
 
Know the Difference Between AI, Machine Learning, and Deep Learning:
Read more... 

Source: Edgy Labs (blog)


If you enjoyed this post, make sure you subscribe to my Email Updates!

The Real Threat of Artificial Intelligence | New York Times - Sunday Review

Photo: Kai-Fu Lee
"It’s not robot overlords. It’s economic inequality and a new global order" notes Kai-Fu Lee, chairman and chief executive of Sinovation Ventures, a venture capital firm, and the president of its Artificial Intelligence Institute.

Photo: Rune Fisker

Too often the answer to this question resembles the plot of a sci-fi thriller. People worry that developments in A.I. will bring about the “singularity” — that point in history when A.I. surpasses human intelligence, leading to an unimaginable revolution in human affairs. Or they wonder whether instead of our controlling artificial intelligence, it will control us, turning us, in effect, into cyborgs.

These are interesting issues to contemplate, but they are not pressing. They concern situations that may not arise for hundreds of years, if ever. At the moment, there is no known path from our best A.I. tools (like the Google computer program that recently beat the world’s best player of the game of Go) to “general” A.I. — self-aware computer programs that can engage in common-sense reasoning, attain knowledge in multiple domains, feel, express and understand emotions and so on.

This doesn’t mean we have nothing to worry about. On the contrary, the A.I. products that now exist are improving faster than most people realize and promise to radically transform our world, not always for the better. They are only tools, not a competing form of intelligence. But they will reshape what work means and how wealth is created, leading to unprecedented economic inequalities and even altering the global balance of power.

It is imperative that we turn our attention to these imminent challenges.

What is artificial intelligence today? Roughly speaking, it’s technology that takes in huge amounts of information from a specific domain (say, loan repayment histories) and uses it to make a decision in a specific case (whether to give an individual a loan) in the service of a specified goal (maximizing profits for the lender). Think of a spreadsheet on steroids, trained on big data. These tools can outperform human beings at a given task.

This kind of A.I. is spreading to thousands of domains (not just loans), and as it does, it will eliminate many jobs. Bank tellers, customer service representatives, telemarketers, stock and bond traders, even paralegals and radiologists will gradually be replaced by such software. Over time this technology will come to control semiautonomous and autonomous hardware like self-driving cars and robots, displacing factory workers, construction workers, drivers, delivery workers and many others.

Unlike the Industrial Revolution and the computer revolution, the A.I. revolution is not taking certain jobs (artisans, personal assistants who use paper and typewriters) and replacing them with other jobs (assembly-line workers, personal assistants conversant with computers). Instead, it is poised to bring about a wide-scale decimation of jobs — mostly lower-paying jobs, but some higher-paying ones, too.

This transformation will result in enormous profits for the companies that develop A.I., as well as for the companies that adopt it. Imagine how much money a company like Uber would make if it used only robot drivers. Imagine the profits if Apple could manufacture its products without human labor. Imagine the gains to a loan company that could issue 30 million loans a year with virtually no human involvement. (As it happens, my venture capital firm has invested in just such a loan company.)

We are thus facing two developments that do not sit easily together: enormous wealth concentrated in relatively few hands and enormous numbers of people out of work. What is to be done?
Read more...

Source: New York Times


If you enjoyed this post, make sure you subscribe to my Email Updates!

Is Artificial Intelligence Overhyped in 2017? by Quora, Contributor | HuffPost

Is artificial intelligence overhyped in 2017? originally appeared on Quora - the place to gain and share knowledge, empowering people to learn from others and better understand the world.




Answer by Joanne Chen, Partner at Foundation Capital, on Quora:

To quote Bill Gates: “We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten. Don’t let yourself be lulled into inaction.”

In short, over the next ten years, I don’t believe AI will be overhyped. However, in 2017, will all of our jobs be automated away by bots? Unlikely. I believe the technology has incredible potential and will permeate across all aspects of our lives. But today, my sense is that many people don’t understand what the state of AI is, and thus contribute to hype.

So what can AI do today?

Artificial intelligence, a concept dating back to the 1950s, is simply the notion that a machine can perform tasks that require human intelligence. But AI today is not what the science fiction movies portray it to be. What we can do today falls in the realm of narrow AI (vs. general intelligence), which is the idea that machines can perform very specific tasks in a constrained environment. Within narrow AI, there are a variety of techniques that you may have heard of. I’ll use examples to illustrate the differences.

Let’s say you want to figure out my age (which is 31).

1) Functional programming: what we commonly know as programming, a way to tell a computer to do something in a deterministic fashion. I tell my computer that to compute my age, it needs to solve AGE = today’s date – birth date. Then I give it my birth date (Dec 4, 1985). There is 0% chance the computer will get my age wrong.
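The deterministic calculation described above can be sketched in a few lines of Python. The dates come from the article's own example; the one-line adjustment is the standard way to avoid an off-by-one error when the birthday has not yet occurred this year:

```python
from datetime import date

def compute_age(birth_date, on_date):
    """Deterministic age: subtract the years, then subtract one more
    if the birthday has not yet occurred in the target year."""
    years = on_date.year - birth_date.year
    if (on_date.month, on_date.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

# The article's example: born Dec 4, 1985, checked in June 2017.
print(compute_age(date(1985, 12, 4), date(2017, 6, 25)))  # → 31
```

As the author says, there is no uncertainty here: given the same inputs, the program always produces the same answer.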

2) Machine learning: an application of AI where we give machines data and let them learn for themselves to probabilistically predict an outcome. The machine improves its ability to predict with experience and more relevant data. So take age, for example. What if I had 1,000 records of people’s ages and song preferences? Song preference is highly correlated with generation. For example, Led Zeppelin and The Doors fans are mostly 40+ and Selena Gomez fans are generally younger than 25. Then I could ask the computer: given that I love the Spice Girls and Backstreet Boys, how old does it think I am? The computer then looks at these correlations and compares them with a list of my favorite songs to predict my age within x% probability. This is a very simple example of using machine learning.
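The idea can be sketched as a toy nearest-neighbour predictor in Python. The training data below is entirely made up for illustration, and this is just one simple way to "learn from data", not any specific production algorithm: rank known listeners by how many favorite artists they share with the target, then average the ages of the closest matches.

```python
# Made-up training data: sets of favorite artists paired with known ages.
training = [
    ({"Led Zeppelin", "The Doors"}, 52),
    ({"Led Zeppelin", "Pink Floyd"}, 48),
    ({"Selena Gomez", "Justin Bieber"}, 19),
    ({"Spice Girls", "Backstreet Boys"}, 30),
    ({"Spice Girls", "Britney Spears"}, 32),
]

def predict_age(favorites, data, k=2):
    """Rank known listeners by how many favorite artists they share
    with the target, then average the ages of the k closest matches."""
    ranked = sorted(data, key=lambda row: len(favorites & row[0]), reverse=True)
    return sum(age for _, age in ranked[:k]) / k

print(predict_age({"Spice Girls", "Backstreet Boys"}, training))  # → 31.0
```

Unlike the deterministic age calculation, the answer here is only an estimate: with different or more training data, the prediction would change.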

3) Deep learning: a type of machine learning that emerged in the last few years and was widely discussed in the media when Google DeepMind’s AlphaGo program defeated South Korean master Lee Se-dol at the board game Go.

Deep learning goes a step further than ML in that it enables the machine to learn purely from examples. In contrast, ML requires programmers to tell the computer what it should look for. As a result, deep learning functions much more like the human brain. This works especially well for applications like image recognition.
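The learn-from-examples principle can be shown with a deliberately tiny sketch: a single artificial neuron (nowhere near "deep", but using the same gradient-descent mechanism deep networks rely on) discovers the rule y = 2x purely from example pairs, with no hand-coded rule:

```python
# Training examples for the rule y = 2x; the "network" never sees the rule.
samples = [(x, 2 * x) for x in range(1, 6)]

w = 0.0    # single weight, initially knows nothing
lr = 0.01  # learning rate

for _ in range(500):          # repeated passes over the examples
    for x, y in samples:
        error = w * x - y     # prediction minus target
        w -= lr * error * x   # gradient step on the squared error

print(round(w, 3))  # converges to 2.0
```

Deep networks stack millions of such weights in many layers, which is what lets them learn features (edges, textures, cat ears) directly from raw images instead of having a programmer specify them.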
Read more...

Source: HuffPost


If you enjoyed this post, make sure you subscribe to my Email Updates!

Beware the Hype of Artificial Intelligence | Fortune - Tech

Photo: Jonathan Vanian
"Artificial intelligence has made great strides in the past few years, but it’s also generated much hype over its current capabilities. That’s one takeaway" reports Jonathan Vanian, writer at Fortune with a focus on technology.

Photo: Getty Images

That’s one takeaway from a Friday panel in San Francisco involving leading AI experts hosted by the Association for Computing Machinery for its 50th annual Turing Award for advancements in computer science.

Michael Jordan, a machine learning expert and computer science professor at University of California, Berkeley, said there is “way too much hype” regarding the capabilities of so-called chat bots. Many of these software programs use an AI technique called deep learning in which they are “trained” on massive amounts of conversation data so that they learn to interact with people.

But despite several big tech companies and new startups promising powerful chat bots that speak like humans when prodded, Jordan believes the complexity of human language is too difficult for bots to master with modern techniques like deep learning. These bots essentially perform parlor tricks in which they respond with comments that are loosely related to a particular conversation, but they “can’t say anything true about the real world.”

“We are in an era of enormous hype of deep learning,” said Jordan. Deep learning has the potential to change the economy, he added, but “we are not there yet.”

Also on the panel, Fei-Fei Li, Google’s machine learning cloud chief and a Stanford University professor, said “We are living in one of the most exciting and hyped eras of AI.” Li helped build the ImageNet computer-vision contest, which spurred a renaissance in AI in which researchers applied deep learning to identify objects like cats in photos.

But while everyone talks about ImageNet’s success, “we hardly talk about the failures,” she said, underscoring the hard work researchers face in building powerful computers that can “see” like humans.
Read more... 

Source: Fortune


If you enjoyed this post, make sure you subscribe to my Email Updates!

Saturday, June 24, 2017

Six things to know about network connectivity in Africa | IDG Connect

Photo: Kathryn Cave
Kathryn Cave, Editor at IDG Connect, summarizes: "A special summit collocated with Datacloud Europe 2017 addressed datacentres in Africa."

Photo: IDG Connect

Since the first undersea cables began to connect Africa in the early 2000s, a network of fibre has slowly grown to surround the continent. However, what exists at the edge does not necessarily make its way to the interior and this has resulted in extremely varied internet rates and costs.

The issue of network connectivity was discussed as part of Invest in Datacentre Africa, a bespoke summit collocated within Datacloud Europe 2017, early in June. The summary below attempts to highlight the important points.

Geopolitical issues always rear their ugly heads. It can be hard to talk about Africa without getting lost in a minefield of mixed meanings. Sub-Saharan countries get lumped together because they are all served by the same cables. North African countries are often not included in discussions about Africa at all because they’re served by a different set of cables. And there can be a tendency to ignore Francophone countries altogether and focus exclusively on English-speaking ones (although, admittedly, not so much in the networking space).
Read more... 

Source: IDG Connect


If you enjoyed this post, make sure you subscribe to my Email Updates!

Give Teachers Credit: They Know Learning Is Social | EdSurge

Follow on Twitter as @spirrison
"The enthusiasm shared by educators who understand that social media will forever impact their lives and practice is very reminiscent of the vibe expressed by dot-commers two decades ago during the first wave of the Internet boom—this is a very good thing." says Brad Spirrison, Senior Director at Participate.

Photo: Rawpixel.com / Shutterstock
I’ve served as both a journalist and participant within each movement. My job is to interview and survey the pioneers, investors and stakeholders who drive technological change, share their stories, and collaborate with very smart people to build and distribute tools that help everyone else get involved.

The parallels between the early days of the world wide web and today’s edtech scene are surreal. First, you have your tinkerers who recognize the network potential of organizing information, resources and advice around communities. In the nineties, this included Geocities, Lycos and Jerry’s Guide to the World Wide Web (later called Yahoo!). More recently, communities and directories including #edchat, eduClipper and Cybraryman (AKA Jerry Blumengarten’s guide to educational websites and chats) provided voice, structure and inspiration to educators looking to connect and collaborate in ways never before possible.

As more individuals organically buy into the movement, a second layer of investors, opportunists and outright charlatans get involved. In the nineties, I literally wrote half a dozen stories analyzing the hundreds of millions of dollars invested in the online pet foods space. Virtually all of those companies, along with thousands of other venture-backed outfits during that time, turned into doo-doo.

This is also a very good thing. Railroads, telephone networks and the internet could not have been built without financial and emotional excess. Whether you are an investor, participant or observer, the key amidst these periods is to recognize innovations that remain true to the underlying cause of whatever movements they spawn within. This means approaching the very individuals and organizations you want to serve, building trust, sharing stories and identifying what problem they wish to solve.

There is a lot of noise in edtech today, mostly coming from technology and consumer marketing-oriented companies. They are trying to cut and paste solutions they built for one industry and sell them to teachers and administrators because they feel the market is hot. This approach won’t work with passionate educators who recognize that their world is changing because of technology. They don’t have time for doo-doo.

Here’s what teachers are doing with their own time...

Brad Spirrison ends his article with the following: "We are never going back to how things used to be. Together, we have the opportunity to frame and define what’s next."
Read more...

Source: EdSurge


If you enjoyed this post, make sure you subscribe to my Email Updates!

This is your brain on PhD | Times Higher Education (THE) (blog)

Photo: Steven Franklin
Steven Franklin, visiting tutor in the history department and a PhD candidate at Royal Holloway, University of London lays bare the questions and doubts that go through his mind as he sits down to work on his thesis.

Photo: iStock

When you start a PhD, the first words you hear are: “It’s going to be hard.” As someone just starting out on an academic journey, your natural response is: “Pah! I’ll prove them all wrong, I’m the exception, not the rule.” But there's a reason they say these things – it’s because a PhD is difficult, and sometimes torturous too.

Thinking logically about the process, it shouldn’t be difficult at all. You have four years (eight if you’re doing it part-time), and so by my poor maths, it works out at roughly 65 words a day. Easy! We can all do that. I mean I’ve written more here already! Sadly, it’s not that simple – what a pity. That logic doesn’t factor in any time for conceptualising your idea into something achievable, the research, the manipulation of that research into argumentative prose and then the inevitable rewrites.

Still, let’s be generous and say 200 words a day for less than two years and your project will be complete. In fact, you’d have almost written two.

Of course, there are other pressures that every PhD student must deal with. There’s an expectation for us to take some baby steps into the world of academia. We must present our work at seminars and conferences. Get used to our work being criticised and come back stronger from that. After all, no piece of work is ever the finished article. No one, to my knowledge, has yet written the last word on any piece of history – although there are plenty of academics who’d be disturbed by the thought of their word not being the last.

Conferences are another way to introduce yourself to the academic world. Make a name for yourself. Socialise in the correct circles. These are the people that might one day examine you, become colleagues or write you a reference. We need to make the most of these exchanges. At the end of the day our future depends on it.

Then, if you’re like me, you don’t have funding, and so you must work to make ends meet. Mummy and daddy might be able to support you, but this 28-year-old would prefer some form of independence. I may be a student but I refuse to be seen as the stereotype. I work, undoubtedly more than I should, and I do my work well. One finds that if you work hard and do it to a high enough standard more doors open. People see your use. Before you know it you have an invite to the department Christmas meal. Not a bad achievement given you were employed on a short-term basis to help with some admin.

Factoring in those things, I’m now needing to write in the region of 400 words a day. Thinking about it, maybe a little less. It’s still achievable. Isn’t it? Well of course it is.

But our list does not end there. If you’d like to get anywhere in academia, it’s desirable that you've taught, published an article or two prior to thesis submission, and written a few academic book reviews. These, sadly, suck time. Time we, perhaps just I, do not have.

Let's also pause for a moment to reflect on the poker game that you play with your PhD peers. It’s an unspoken truth, but academia is essentially a game of “my fish is bigger than yours”. It’s not necessarily about quality of produced work. It’s all about quantity. The more you have, the better you are. What “have” can be anything, too. Scholarly works, academic prizes, research scholarships and media contributions are all ways of physically displaying that you’re on your way to greatness.

PhD students play the game as well as anybody else. Why blame them? The very nature of the profession dictates that you must sell yourself at every possible moment and be opportunistic, too. Don't get me wrong. I love my PhD peers but there are times when the game gets tiresome.

So, where am I left now? Ah yes, 500 words a day over 200 days and the job’s done!
Read more... 

Source: Times Higher Education (THE) (blog) 


If you enjoyed this post, make sure you subscribe to my Email Updates!

Explore the diversification of 21st century classrooms and schools | Eduplanet21 - Learning Paths

Check these out below.

Digital Citizenship: Preparing Teachers and Students for Online Interaction 

Digital Citizenship: Preparing Teachers and Students for Online Interaction

Description
Digital citizenship is defined by DigitalCitizenship.net, a leading digital citizenship informational website that we will explore further later in the learning path, as the norms of appropriate, responsible behavior with regard to technology use.
Throughout this learning path, you will engage in a variety of activities designed to help you think critically and actively about how to share the ideas of digital citizenship with your students and encourage them to demonstrate positive online behavior. Topics discussed in this pathway include digital literacy, digital footprints, social media, plagiarism, privacy, copyright, standards, and policy.

Objectives: 
  • Develop an understanding of the responsibilities and challenges associated with being a digital citizen; 
  • Relate the elements of digital citizenship to professional practice; 
  • Explore resources designed to help incorporate digital citizenship into the classroom;
  •  Review district policy and procedures related to digital literacy and digital learning; and  
  • Recognize standards that are addressed when teaching digital citizenship.
Read more... 

Text-Dependent Analysis: An Introduction 

Text-Dependent Analysis: An Introduction

Description
Text-dependent analysis (TDA) is a unique-to-Pennsylvania term for text-dependent questions. Text-dependent analysis epitomizes why reading and writing have been joined under the umbrella of English/Language Arts. The first module builds background knowledge, explores the instructional shifts, and takes a dive into the Pennsylvania Department of Education’s explanation of Text-Dependent Analysis. Understanding the similarities and differences of open-ended and text-dependent questions, along with exploring common myths and misconceptions, is the focus of the second module.

Objectives:
  • Define Text-Dependent Analysis; 
  • Explore your current understanding of Text-Dependent Analysis; 
  • Expand your knowledge base of the instructional shifts for English/Language Arts; 
  • Tackle several myths and misconceptions of Text-Dependent Analysis and provide examples of best instructional practices; and, 
  • Examine resources provided by the Pennsylvania Department of Education to learn the cognitive demands of Text-Dependent Analysis. 
Read more...

Source: Eduplanet21


If you enjoyed this post, make sure you subscribe to my Email Updates!

Friday, June 23, 2017

Will Free Community College Really Help Low-Income Students? | Education Week - Opinion

Photo: Kate Schwass
"Money isn’t the only obstacle to college completion" argues Kate Schwass, San Francisco Bay Area executive director for CollegeSpring.

Photo: Getty

In the face of soaring college tuitions and skyrocketing educational inequity, many educators and lawmakers are suggesting a way to help low-income students earn degrees: Why not offer in-state students community college for free?
After years in the shadows, the idea is gaining real momentum. Just last month, Tennessee, which already had a free-community-college program for recent high school graduates, announced it will open that program to any adult in the state without a college degree in 2018. Earlier this year, San Francisco became the first city in the country to offer free community college to all of its residents, and lawmakers in California, New York, and Rhode Island introduced similar proposals to cover tuition and other costs for students.

At first glance, it’s hard to see why free community college (specifically, free tuition for two-year schools that grant associate degrees) would be anything but helpful for students from low-income backgrounds. Students who graduate from community college have lower rates of unemployment and earn $6,600 more a year than those who have a high school diploma. Remove the cost of earning an associate’s degree, and you’ll put its benefits within reach of any student who wants one—right?
 
It’s clear that the prospect of free tuition will likely motivate more low-income students to enroll in community college. But those students still face considerable obstacles having nothing to do with money once they arrive on campus. What’s more, free tuition could deter low-income students from pursuing four-year colleges and universities.

Until educators account for these truths, we could be inadvertently pushing large numbers of students away from their best educational path in four-year colleges or universities.

One challenge for low-income community college students is that they are more likely to be needlessly placed in remedial courses than their wealthier peers and those at four-year colleges. Most colleges require that incoming students take standardized placement tests to see if they need remedial reading, writing, and math courses. About 70 percent of low-income community college students are placed in remedial courses, compared with about 50 percent of their wealthier community college peers.

Despite good intentions, remedial courses are a significant barrier to graduation. A 2012 report from the nonprofit Complete College America found that fewer than one in 10 students who begin in remedial courses graduate from community colleges within three years. This is in part because remedial courses don’t count toward a degree—a discouraging prospect for many students. What’s more, the standardized tests that typically determine remediation decisions are often not accurate—as many as a third of all students are forced to review content they already understand well.
Read more...

Source: Education Week


If you enjoyed this post, make sure you subscribe to my Email Updates!

LBCC program to help adults return to school | Albany Democrat-Herald - In brief

A free program designed to help adult learners go back to school is being offered through Linn-Benton Community College starting June 27.


Classes meet Tuesdays and Thursdays from 2 p.m. to 5:30 p.m. starting June 27 through Aug. 31. Classes cover computer skills, time management, communication skills, critical thinking skills, employment skills, learning styles, how to identify strengths and more.

The Empower Program is a strength-based, supportive learning program designed to help with transitioning into LBCC and being successful in college.

Empower classes are free, with a focus on college and career readiness, self-development, self-assessment, goal-setting and planning, support network development, and overall well-being.

Participants learn to increase their ability to succeed, realize potential, develop a network of college and community support, and become empowered in lifelong success in college, professional and personal development.

For more information, contact Malinda Shell by email at shellm@linnbenton.edu.


If you enjoyed this post, make sure you subscribe to my Email Updates!

Scholar Documents Economic War on Chinese Restaurant | Diverse: Issues in Higher Education

"New research reveals another level of the United States’ exploitation of and discrimination against Chinese Americans. University of California, Davis law professor Gabriel “Jack” Chin and research assistant John Ormonde have uncovered evidence of economic bias over the years against Chinese restaurants through a search of digital archives." notes Gia Savage, Journalist.

Photo: Gabriel “Jack” Chin
Chin has been conducting research on race and the law and Asian Americans and the law for over 20 years. He discovered the restaurant bias years ago, but was not completely sure of how it connected to a broader historical point. He and Ormonde, the study’s co-author, decided to dig deeper.

“We spent some time to see what else we could find out about the treatment of Chinese restaurants in this period,” said Chin.

By using digital archives, Chin and Ormonde were able to find a disadvantageous pattern of jobs and economic growth being withheld from Chinese restaurants.

“It shows an unfortunate tradition of good jobs being reserved for Whites,” said Chin. “The unions that approached the Chinese restaurants frankly and explicitly argued that Whites should patronize White restaurants, and give their business to White people, and Chinese shouldn’t have these opportunities.”

Though once inaccessible because their existence was limited to paper copy, archived newspapers, records of city council proceedings and outdated state codes are what led to the discovery of a “war” on Chinese restaurants that lasted over 30 years. These documents have been digitized, which makes them more accessible to those interested in viewing.

As Chinese food and restaurants became popular in the United States, traditional American restaurants were suddenly at risk of losing business.
Read more...

Source: Diverse: Issues in Higher Education 


If you enjoyed this post, make sure you subscribe to my Email Updates!

Companies must move fast to take advantage of digitally skilled population in the mainland and Hong Kong | South China Morning Post

Photo: Gianfranco Casati
"A high level of digital readiness displayed by both male and female undergraduates is the new weapon of choice to foster growth, according to several shining examples on mainland’s corporate horizon" says Gianfranco Casati, Accenture’s group chief executive.

Executives should act quickly to tap digital skill set of their staff 
Photo: Nora Tam


Hop on a mainland-bound train in Hung Hom station and you are likely to see the same thing you would see on any other MTR commuter train: people sat hunched over their mobile phones, engaged on WeChat or playing Honor of Kings, one of the mobile games by Tencent Holdings, listening to music or reading work emails.

Herein lies a weapon for corporate growth – digital readiness.

The mainland’s digital readiness, internet technology expertise, use and investment will serve as a powerful base to spur growth. For example, by harnessing the power of digital, China stands to grow its gross domestic product by 3.75 per cent by 2020, the equivalent of adding US$527 billion to the economy during that time frame.

Consider the fundamentals. Earlier this year, we commissioned research on women in the workforce with an eye to see what skills are needed to help women attain pay parity with men.

One of the ancillary insights the research yielded was that in many emerging markets, particularly the mainland, women tended to be as schooled in IT as men, creating a much larger potential pool of candidates for such work.

We found that the digital capabilities of male and female undergraduates in the mainland were fairly equal – 96 per cent of the undergraduate women surveyed said they had taken computing/coding module classes (against 100 per cent of the surveyed men), and 73 per cent of the women students said they thought they adopted new technologies fast, as compared with 79 per cent of the men. A graduate base of nearly universal tech savviness is an essential building block for future ready businesses and nations that can rotate to the new world order.

Managers need to harness this digital strength. Offering corporate training on mobile phones, to be finalised when an employee chooses, gives staff the option to use their commuting time for continuous learning. It makes retooling and refining skills more flexible.

Source: South China Morning Post


If you enjoyed this post, make sure you subscribe to my Email Updates!

Malvern International PLC Launch Its First Digital Learning Technology | Malvern International plc - Education

"Malvern International plc (AIM: MLVN), the provider of educational services in the UK, Europe and Asia, is pleased to announce, that following the strategic plan announced in November 2016 and as part of its collaboration with Playware Studios Singapore (announced in January 2017), the launch of its first digital offering in the area of Learning Technology." continues Malvern International plc.


Photo: GraphicStock.com

Malvern and Playware Studios are offering the platform and relevant services to allow potential customers to create customised mobile-based games that can be used for learning and corporate training. The product is not only useful for schools but can also be used by any industry with training needs.

The mobile games produced through the platform can increase user engagement and enhance knowledge retention, with a practice-based learning experience that is fresh, interesting and effective. The platform can also be used to create virtual labs, simulations and interactive case studies, and direct training and testing games where learners can experiment with different scenarios through their mobiles.  

The product has a proven record of success in improving training engagement and efficiency within the classroom or workplace. As an innovative technology platform, 3DHive.mobi has been recognised internationally with awards such as: the Brandon Hall Group Awards for Excellence in Technology and Excellence in Education (2015), Microsoft Partner of the Year (Public Sector Education 2013), the BETT Asia and IDA Award for EdTech Innovation (2014) and the Innovplus Flame Award from WDA Singapore (2016).

The platform and a series of sample games have been tested with several large and small organisations in Singapore. Malvern has commenced marketing the product in East Asia, India, Nepal, and the UK, and will continue the expansion to cover other countries through a strategic distribution model. This initiative forms part of the expansion of Malvern’s Learning Technology division, which is seeking to position itself as a global learning technology partner. More information can be found on Malvern International’s website.


If you enjoyed this post, make sure you subscribe to my Email Updates!

Thursday, June 22, 2017

UAB scholars study in Japan, Cuba, and France | Birmingham Times

Follow on Twitter as @TiffanyWestry
"Four University of Alabama at Birmingham students are among 1,200 undergraduate students from 354 colleges and universities across the United States selected to receive the Benjamin A. Gilman International Scholarship," informs Tiffany Westry Womack.

University of Alabama at Birmingham
Photo: UAB.edu

The Gilman Scholarship is sponsored by the U.S. Department of State’s Bureau of Educational and Cultural Affairs. The program aims to make study abroad experiences accessible to a more diverse population of students and to encourage students to choose less traditional study abroad destinations. It also gives students the opportunity to gain a better understanding of other cultures, countries, languages and economies — making them better prepared to assume leadership roles in government and the private sector.

Students are selected for the Gilman Scholarship through a highly competitive application process. The program receives more than 10,000 applications each year and awards about 2,500 scholarships. Gilman scholars are awarded up to $5,000 toward their study abroad or internship program costs. The program aims to support students who traditionally have been underrepresented in education abroad, including, but not limited to, students with high financial need, first-generation college students, students in STEM fields, students from diverse ethnic backgrounds and students with disabilities.

Kenneth Davis, a sophomore double-majoring in chemistry and mathematics with a minor in Japanese, is studying in Japan. Davis is a native of Selma, Alabama, and is in the UAB Honors College’s Science and Technology Honors Program. He is also a recipient of the Freeman-ASIA Scholarship to study abroad. Davis plans to obtain a master’s degree in mathematics before applying to medical school to become a neurosurgeon.

“Spending the summer in Japan is fulfilling a curiosity about Japanese culture that I have fostered since I was a child,” Davis said. “Now that I am able to actually live and interact with native Japanese speakers in Tokyo and surrounding prefectures, I have learned that I am becoming more aware of the subtle differences between my home and Japan. It is my hope that proficiency in Japanese and a mathematics degree will make me a more marketable job candidate.”


If you enjoyed this post, make sure you subscribe to my Email Updates!

Wednesday, June 21, 2017

The most forward-thinking, future-proof college in America teaches every student the exact same stuff

College is supposed to help young people prepare for the future. But as headlines warn that automation and technology may change—or end—work as we know it, parents, students, and universities are grappling with a new question: How do you educate a new generation for a world we can’t even imagine?


Photo: Peter Marber
As David Brooks of the New York Times recently wrote, the college has the “courage to be distinct,” notes Peter Marber, who teaches at Harvard and Johns Hopkins.

Great books. 
Photo: Unsplash/Roman Kraft

A recent Pew Research Center survey of 1,408 technology and education professionals suggested that the most valuable skills in the future will be those that machines can’t yet easily replicate, like creativity, critical thinking, emotional intelligence, adaptability and collaboration. In short, people need to learn how to learn, because the only hedge against a fast-changing world is the ability to think, adapt and collaborate well.

Academically Adrift:
Limited Learning on College Campuses
But many American college students may not be learning them at all. In the 2011 book Academically Adrift: Limited Learning on College Campuses, Richard Arum and Josipa Roksa chronicled how few American students really improved cognitively–and learned to learn–during their undergraduate education. Few bachelor’s programs require sufficient amounts of the reading, writing, and discourse needed to develop critical thinking skills. In fact, forty percent of American undergraduates now major in business and management-related subjects, reading mainly textbooks and short articles, and rarely writing a paper longer than three pages. Further, the social bonds and skills formed in college today often center on extracurriculars that have little connection to cognitive development and collaborative problem-solving.

But perhaps instead of reinventing higher education, we can give students what they need for the future by returning to the roots of liberal arts. Consider St. John’s College, America’s third-oldest institution of higher education, founded in 1696. With fewer than 700 students between two campuses in Annapolis and Santa Fe, St. John’s is a bit under the radar. But it’s emerged as one of the most distinctive colleges in the country by maintaining a strict focus on the classics of the Western canon...
  
The Program’s philosophy and practice 
You will not find 100-person lectures, teaching assistants or multiple-choice tests at St. John’s. Instead classes are led by “Tutors” who guide students through Socratic inquiry (and yes, students do read about the Socratic practice during freshman year in Plato’s Theaetetus). Despite its reputation as a sadistic exercise in student humiliation, the Socratic method is actually an interactive form of intellectual sandpapering that smooths out hypotheses and eliminates weak ideas through group discourse. Tutors lead St. John’s discussions but rarely dominate; they are more like conversation facilitators, believing that everyone in class is a teacher, everyone a learner. And you won’t find Johnnies texting or surfing social media while in class; there is no place to hide in classrooms that range from small (seminars, 20 students led by two tutors) to smaller (tutorials, 10 to 15 students, one tutor) to smallest (preceptorials, 3 to 8 students, one tutor).

There is a formality in a St. John’s classroom—an un-ironic seriousness—that feels out of another era. Students and Tutors address each other by “Mr.” or “Ms.” (or the gender-inclusive honorific of choice). Classrooms have a retro feel, with rectangular seminar tables and blackboards on surrounding walls, and science labs filled with analog instruments, wood and glass cabinets, old school beakers and test tubes.

You have to observe a few St. John’s classes to get a sense of what’s happening between and among the students and Tutors. Discussions are often free-flowing, with students thinking out loud and talking to the ceiling; you can almost hear the gears turning in their brains. There are many “a-ha” moments in a St. John’s classroom, sometimes coaxed out by Tutors in Socratic fashion. But often they are triggered by students theorizing and responding among themselves.

In one class I attended, students were covering Ptolemy, the second-century mathematician. Ptolemy believed that all the celestial bodies and the sun revolved around the earth in a circle, and based all his mathematical calculations on this perspective. Students were buzzing at the blackboard, working with a geometry sphere around the table, talking about diameters, meridians and equators, tilts, and horizons. Keep in mind this is all prep for what will be studied in a few months, when these Johnnies will learn that it would be another 1,400 years before Copernicus proved Ptolemy’s calculations correct but his conclusion wrong: the earth and planets actually revolve around the sun. These same students will eventually feel the excitement of learning Kepler’s conclusion 150 years later, that Copernicus was also right and wrong: yes, the earth and planets revolved around the sun—but in an elliptical, not circular, orbit. This curricular layering is central to the St. John’s Program. Later texts respond to and build upon previous texts. In essence, students intellectually follow modern thought as it has been built over the last 2,000-plus years instead of just memorizing the end results.

The cognitive rigor, immersion, and passion so present at St. John’s are rare on American campuses these days. Johnnies read roughly 100-150 books during their four years and write 25 to 30 papers that are more than 10 pages long. Seniors choose a writer or single text and do a deep-dive thesis that typically runs 40-50 pages. Here are a few of the senior capstone topics for the class of 2017: 19th-century English scientist Michael Faraday’s heuristic description of electromagnetic phenomena; 17th-century mathematician Gottfried Wilhelm Leibniz’s treatment of curvature in what’s called the “chain line” problem; the use of Aristotelian terminology by 20th-century physicist Werner Heisenberg in describing quantum mechanics; and the possible revision of “space” from Immanuel Kant’s The Critique of Pure Reason into a plurality of “spaces.” Few college-educated outsiders may have a clue what any of these papers are about, but they are not atypical of what’s being studied, discussed, and written about at St. John’s.
Read more... 

Source: Quartz


If you enjoyed this post, make sure you subscribe to my Email Updates!