Thursday, December 29, 2016

How To Develop the GAMSAT Section 1 Mindset

gamsat.acer.edu.au
by Mike: http://thegamsatblog.com/2016/10/develop-gamsat-section-1-mindset/

GAMSAT Section 1 (Humanities & Social Sciences) is quite difficult to prepare for; however, there are a few things candidates can do during preparation to increase their chances of passing and scoring in the top percentile. The key to passing Section 1 is developing a ‘GAMSAT Section 1 mindset’: changing the way you think and the flow of your reading.

The best way to prepare for Section 1 is to read a vast range of material. However, reading piece after piece with no clear strategy about what to look for is a waste of time and will probably leave you performing poorly. You need to practice and train yourself to change two things:
  1. Thought Process – Read between the lines
  2. Flow of Reading  – Understand the vocabulary
I will focus on each element of the GAMSAT Section 1 Mindset in more detail below. 

Thought Process

You can’t simply read materials for enjoyment; merely understanding the information the author is getting across is not enough. You need to read between the lines and question everything. You want to put yourself in the author’s mind. Ask yourself the following questions every time you read an article, blog post, poem, or novel:
  • What is the subject of the conversation?
  • What is the author’s context?
  • What is the author’s tone? Is it negative, positive, or neutral? Is the author in a bad or good mood? Detached or euphoric?
  • What first impression is the writer creating?
  • What is the writer’s final opinion?
  • What setting has the writer created? (especially for fiction and poetry)
  • From what perspective does the author address the subject? Observe how the language the author uses and the style in which s/he writes reflect his/her overall sentiment.
  • What words or phrases does the author use, and why?
  • What are the personalities of the characters? Look at the characters’ actions and dialogue in the passage.
These are the key things you want to pick out when reading: try to understand the author’s mindset and read between the lines. I recommend summarising articles, poems, and novels using the questions above. The more you practice, the better you’ll get at this.

Flow of Reading

Improving your flow of reading will help you understand the vocabulary being used in the shortest time possible. One of the major difficulties in GAMSAT Section 1 is timing: you need to be able to attempt 75 questions in 100 minutes (plus 10 minutes of reading time), which works out to roughly 80 seconds per question. So it is important to develop your flow of reading. Once you’ve had a bit of practice and have developed your thought process, start timing yourself when reading and work on the following:
  • Identify keywords or the topic of discussion by skimming through the content.
  • Speed read - gather the necessary additional information at a quicker pace. Try using spreeder.com to read articles.
  • Understand commonly used metaphors and figures of speech.
  • Avoid omitting details - practice summarising every sentence or paragraph in your head.
With enough practice you can improve both elements and increase your overall score. Take the time to develop the GAMSAT Section 1 mindset and you will see a huge difference in your results.

Good luck!

Mike.

Thursday, December 22, 2016

Busting the ‘Neuromyths’ About How We Learn

Image: Dominant learning style of target audience (Wikipedia)
by Duncan Geere, a Gothenburg-based freelance science, technology, and culture journalist and editor at http://www.howwegettonext.com, on How We Get to Next: https://howwegettonext.com/busting-the-neuromyths-about-how-we-learn-df4d0cee3e56#.hhhcbx5eh
 
Some people learn best by doing, right? Others have a visual memory, and it’s important for them to see something depicted if they want to remember it. Then there are those who learn most effectively through reading and writing, and another group takes on new ideas best if they hear them.
 
This idea of different “learning styles” is widely accepted among a huge proportion of the public. But there’s one major problem - there’s no evidence that it’s true. 
 
In a 2008 paper, four psychologists reviewed every study ever conducted on learning styles, dating all the way back to the 1920s. They found loads of evidence that both kids and adults will, if asked, express a preference about how they like information to be presented to them. They also found that some people are definitively better than others at processing different kinds of information.
 
What they didn’t find was any evidence that those two truths interact - that is, no sign that an instructional method which proves effective for students with one learning preference is any more or less effective for students with a different one. While there have been a lot of studies on learning styles, only a handful were designed to adequately test their validity in the classroom. Of those that were, several contradicted this accepted wisdom about how we learn best.
 
It’s important to note that a lack of evidence for something is not the same as actively disproving it. But it should also be said that disproving learning styles would require a far, far higher base of evidence, and may even be impossible. 
 
“At present, there is no adequate evidence base to justify incorporating learning-styles assessments into general educational practice,” the psychologists wrote in 2008. “Limited education resources would better be devoted to adopting other educational practices that have a strong evidence base, of which there are an increasing number.” 
 
Learning styles isn’t the only flawed belief we have about the ways we learn. In fact, there’s a whole laundry list of “neuromyths,” and some are more insidious than others.
 
Let’s start with the persistent myth that humans only use 10% of their brains. You can probably guess that this one just isn’t true - it’s unlikely that evolution and/or God would have provided us with a bodily organ that’s 90% useless. In reality, we use almost every part of the brain over a 24-hour period, though small groups of neurons go through constant cycles of being dormant and active. “Evidence would show over a day you use 100% of the brain,” John Henley, a neurologist at the Mayo Clinic, told Scientific American in 2008.
 
Then there’s the one about people being “left-brained” or “right-brained” based on their personality and cognitive style. Left-brained people, it’s believed, have a more logical and methodological approach, while right-brained people are more creative and intuitive.
 
But in 2013 a team of neuroscientists investigated whether this assessment had merit by looking at MRI brain images from more than a thousand volunteers between the ages of 7 and 29. They found that while certain networks of neurons tended to be stronger on either the left or right hemisphere of individual brains, that side preference didn’t hold true for the entire brain. “Our data are not consistent with a whole-brain phenotype of greater ‘left-brained’ or greater ‘right-brained’ network strength across individuals,” they concluded. 
 
These types of myths might be easy to debunk, but other fallacies are more deeply ingrained in our education systems and harder to root out. Take early childhood education, for example. 
 
The human brain grows so much in a child’s first five years that it would seem obvious that preschool programs would have a huge effect on cognitive development. Except that meta-studies show that by the age of 8 it’s almost impossible to tell which children had preschool education and which didn’t. So while your little darling might seem to be enjoying those early days in the classroom, it has no detectable long-term effect on his or her gray matter.
 
Similarly, much has been said about the importance of play in child development. But in 2013 a group of psychologists reviewed 40 years of studies before writing: “Our take-away message is that existing evidence does not support strong causal claims about the unique importance of pretend play for development, and that much more and better research is essential for clarifying its possible role.” It’s entirely plausible that play is merely one of many routes to development, or perhaps a secondary effect of those development strategies.
 
The concept of “digital natives” is pretty questionable, too - the idea that kids who have grown up with the web have somehow developed the ability to do many different things at the same time in a way that their parents can’t. In actuality, studies show that today’s university students use a very limited range of largely established services (Google, Facebook, Wikipedia, etc.) for both learning and socializing, and don’t have an especially deep knowledge of technology. As for multitasking, kids have become practiced at it, sure, but they still suffer the exact same cognitive setbacks that non-“digital natives” do when trying to do several things at once.
 
“There is overwhelming evidence that [digital natives] do not exist,” wrote psychologists Paul A. Kirschner and Jeroen J.G. van Merriënboer in a study of urban legends in education in 2013. “They are not capable of doing that with modern technologies which is ascribed to their repertoire,” they said, and “they actually may ‘suffer’ if teaching and education tries to play into these so-called abilities to relate to, work with, and control their own learning in multimedia and digitally pervasive environments.”
 
When it comes to the classroom, perhaps the most sinister practice of all is medicating students who don’t perform well. In a 2015 review titled “What Doesn’t Work in Education: The Politics of Distraction” (full disclosure: the report was published by Pearson, a partner sponsor of this month’s stories), John Hattie wrote: “There has been a major increase in the number of children who come to school each day pre-labelled. In my own state, Victoria, the incidence of autism and Asperger’s has increased 340% in the past three years.”
 
He continued: “Although diagnostic tests may have improved, it is hard to believe that these major increases in incidence are real. One potential reason for the increase might be parents’ (and teachers’) desire to seek an explanation for ‘unusual’ behaviours and the medical and pharmaceutical professions’ ready provision of answers (and drugs). Another potential reason for the spike might be the extra funding that is tied to students who are labelled as autistic.”
 
Hattie was very clear not to claim that ADHD and autism aren’t real; they are, he said. “Instead, I believe that the massive increase in the frequency of these labels points to a potential cultural problem: students are being diagnosed and labelled primarily for financial and accountability reasons rather than for the enactment of appropriate educational interventions.” 
 
These educational myths are not an insignificant problem - they affect teachers just as much as they do the general public. In a 2012 study, 242 teachers in the United Kingdom and the Netherlands believed an average of nearly half of the “neuromyths” gathered by the researchers, especially those linked to commercialized education programs. Examples include Brain Gym, a California nonprofit that promotes certain physical exercises it says improve children’s ability to learn, based entirely on pseudoscience, and the VARK program’s promotion of learning styles.
 
“These myths persist because they spread easily, offer alluring explanations, and simple, practical solutions,” said Harry Fletcher-Wood, an education researcher at the Institute for Teaching in London. “They spread easily because they are relatively simple - albeit dressed to impress in pseudoscientific explanations.”
 
In 2014, Stanford’s Jack Schneider wrote a book aiming to help scientists spread evidence-based strategies in education called From the Ivory Tower to the Schoolhouse: How Scholarship Becomes Common Knowledge in Education. In it, Schneider lists four factors that any idea must have if teachers are going to notice, accept, use, and share it. It’s clear, however, that these same factors are just as good at spreading pseudoscience.
 
The first factor asks if the idea is relevant to something teachers experience, and whether there appears to be evidence to back it up. Most educational myths that persist deal with situations teachers come across a lot, and they’re based around enough neuroscience that they sound plausible to someone who hasn’t studied them in depth.
 
The second factor, acceptance, means the idea presented must be compatible with the inner values of teachers. Many educators like to believe that they can find creative methods for teaching their students even inside the rigid, one-size-fits-all system they grew up with - so the more ideas sound like they can be personalized to a student, the more likely they are to be looked upon favorably.
 
The third, usage, looks at how easily an idea can be implemented in the classroom. It’s fairly simple to create a lesson that takes learning styles, the left brain/right brain myth, or the importance of play into account, for example. And while it’s harder for an individual teacher to spread the idea that preschool education is vital or that kids who aren’t performing well in the classroom may have mental health issues, these concepts take hold at a higher level among those setting education and health policy.
 
The fourth and final factor relies on how spreadable the idea is. Does it require years of training to learn it, or can it be picked up in a half-hour conference session? Ideas that fit the latter description are much more likely to go viral for obvious reasons - they’re easy to communicate.
 
“What we are dealing with here is a very popular and very persistent pseudoscience, which jeopardizes both the quality of education and the credibility of the educational sciences,” said Kirschner and van Merriënboer. “There is the risk that we are entering a downward spiral: The popularity of urban legends paints the educational sciences as a mumbo-jumbo science, which in turn makes it increasingly difficult to move valuable innovations in the field of education into practice.”
 
Fletcher-Wood added: “[These myths] offer alluring models which seem to explain much of what we see. And they offer simple solutions: Kids aren’t concentrating - give them a tablet! [Suddenly] they’re digital natives!” Unfortunately, these falsehoods will “remain remarkably stubborn,” he said, “because people tend to discount new information which contradicts their existing beliefs.”
 
Some researchers are more optimistic than others about whether it’s ultimately possible to chase out these misconceptions for good. Kirschner and van Merriënboer are not hopeful. “The step from legend-based education based on pseudoscience to evidence-based education based on science demands a quantum leap,” they wrote. “Rather than a quick change in research methodologies or objects of study, it requires a fundamental change in scientific attitude.”
 
But in a 2012 article in the journal Frontiers in Psychology, Sanne Dekker, Nikki Lee, Paul Howard-Jones and Jelle Jolles describe how work is already beginning to establish effective methods for chasing out these misbeliefs.
 
“Such intervention studies should be performed according to the principles and approach of evidence-based or evidence-informed practice. This could yield valuable information for the prevention of myths in the future and for the development of valid educational innovations,” they said.
 
Fletcher-Wood picks out what some solutions could look like in the education system. “The first is raising the general level of research literacy,” he says. “Helping people to spot the difference between a randomized, controlled trial and opinion, based on a handful of surveys. This may sound obvious, but pseudo-experts and the media can both be guilty of promoting work as ‘research’ which does not meet basic quality guidelines.”
 
The second is the “meme-ification” of research, an idea that will no doubt strike fear into the hearts of teachers around the world. “The Learning Scientists’ blogs and posters are an interesting way of trying to share complicated but true research findings in an easy and accessible way,” says Fletcher-Wood. 
 
“We can’t expect everyone to spend their evenings reading peer-reviewed papers; we can present genuine research more conveniently. This brings its own problems - research as meme wars - but it makes us no worse off than we were previously. The other solution is to ensure that those who’ve read around neuromyths combat these ideas humbly but persistently.”
 
Ultimately, the research shows that teachers are interested in learning about the brain and its role in learning. That’s encouraging, wrote Dekker and her colleagues in their 2012 editorial, adding: “Although the integration of neuroscience in educational practice remains challenging, joint efforts of scientists and practitioners may pave the way toward a successful collaboration between the two fields.”

George Orwell’s Six Rules for Writing Clear and Tight Prose

Image via Creative Commons
by Josh Jones, Open Culture: http://www.openculture.com/2016/05/george-orwells-six-rules-for-writing-clear-and-tight-prose.html

Most everyone who knows the work of George Orwell knows his 1946 essay “Politics and the English Language,” in which he rails against careless, confusing, and unclear prose.

“Our civilization is decadent,” he argues, “and our language … must inevitably share in the general collapse.” The examples Orwell quotes are all guilty in various ways of “staleness of imagery” and “lack of precision.”

Ultimately, Orwell claims, bad writing results from corrupt thinking, and often attempts to make palatable corrupt acts: “Political speech and writing are largely the defense of the indefensible.” His examples of colonialism, forced deportations, and bombing campaigns find ready analogues in our own time. Pay attention to how the next article, interview, or book you read uses language “favorable to political conformity” to soften terrible things.

Orwell’s analysis identifies several culprits that obscure meaning and lead to whole paragraphs of bombastic, empty prose: 

Dying metaphors: essentially clichés, which “have lost all evocative power and are merely used because they save people the trouble of inventing phrases for themselves.” 

Operators or verbal false limbs: these are the wordy, awkward constructions in place of a single, simple word. Some examples he gives include “exhibit a tendency to,” “serve the purpose of,” “play a leading part in,” “have the effect of” (one particular peeve of mine when I taught English composition was the phrase “due to the fact that” for the far simpler “because”).

Pretentious diction: Orwell identifies a number of words he says “are used to dress up a simple statement and give an air of scientific impartiality to biased judgments.” He also includes in this category “jargon peculiar to Marxist writing” (“petty bourgeois,” “lackey,” “flunkey,” “hyena”). 

Meaningless words: Abstractions, such as “romantic,” “plastic,” “values,” “human,” “sentimental,” etc. used “in the sense that they not only do not point to any discoverable object, but are hardly ever expected to do so by the reader.” Orwell also damns such political buzzwords as “democracy,” “socialism,” “freedom,” “patriotic,” “justice,” and “fascism,” since they each have “several different meanings which cannot be reconciled with one another.”

Most readers of Orwell’s essay inevitably point out that Orwell himself has committed some of the faults he finds in others, but will also, with some introspection, find those same faults in their own writing. Anyone who writes in an institutional context - be it academia, journalism, or the corporate world - acquires all sorts of bad habits that must be broken with deliberate intent.

“The process” of learning bad writing habits “is reversible,” Orwell promises, “if one is willing to take the necessary trouble.” How should we proceed? These are the rules Orwell suggests:
(i) Never use a metaphor, simile, or other figure of speech which you are used to seeing in print.
(ii) Never use a long word where a short one will do.
(iii) If it is possible to cut a word out, always cut it out.
(iv) Never use the passive where you can use the active.
(v) Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.
(vi) Break any of these rules sooner than say anything outright barbarous.
What constitutes “outright barbarous” wording he does not say, exactly. As the internet cliché has it: Your Mileage May Vary. You may find creative ways to break these rules without thereby being obscure or justifying mass murder.

But Orwell does preface his guidelines with some very sound advice: “Probably it is better to put off using words as long as possible and get one’s meaning as clear as one can through pictures and sensations. Afterward one can choose - not simply accept - the phrases that will best cover the meaning.” Not only does this practice get us closer to using clear, specific, concrete language, but it results in writing that grounds our readers in the sensory world we all share to some degree, rather than the airy world of abstract thought and belief that we don’t.

These “elementary” rules do not cover “the literary use of language,” writes Orwell, “but merely language as an instrument for expressing and not for concealing or preventing thought.” In the seventy years since his essay, the quality of English prose has likely not improved, but our ready access to writing guides of all kinds has. Those who care about clarity of thought and responsible use of rhetoric would do well to consult them often, and to read, or re-read, Orwell’s essay.

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness