Thursday, December 29, 2016

How To Develop the GAMSAT Section 1 Mindset

gamsat.acer.edu.au
by Mike: http://thegamsatblog.com/2016/10/develop-gamsat-section-1-mindset/

GAMSAT Section 1 (Humanities & Social Sciences) is quite difficult to prepare for; however, there are a few things candidates can do during preparation to increase their chances of passing and scoring in the top percentile. The key to passing Section 1 is developing a ‘GAMSAT Section 1 mindset’: changing the way you think and the flow of your reading.

The best way to prepare for Section 1 is by reading a vast range of material. However, reading piece after piece with no clear strategy about what to look for is a waste of time and will probably leave you performing poorly. You need to practice and train yourself to change two things:
  1. Thought Process – Read between the lines
  2. Flow of Reading  – Understand the vocabulary
I will focus on each element of the GAMSAT Section 1 Mindset in more detail below. 

Thought Process

You can’t simply read materials for enjoyment; understanding the information the author is getting across is not enough on its own. You need to read between the lines and question everything. Put yourself in the author’s mind. Ask yourself the following questions every time you read an article, blog post, poem or novel:
  • What is the subject of conversation?
  • What is the author’s context?
  • What is the author’s tone? Is it negative, positive or neutral? Is the author in a good or bad mood? Detached or euphoric?
  • What first impression is the writer creating?
  • What is the writer’s final opinion?
  • What setting has the writer created? (for fiction or poems)
  • From what perspective is the author addressing the subject? The language the author uses and the style in which s/he writes reflect his/her overall sentiment.
  • What words or phrases does the author use, and why?
  • What are the characters’ personalities? Look at the characters’ actions and dialogue in the passage.
These are the key things to pick out when reading: try to understand the author’s mindset and read between the lines. I recommend summarising articles, poems and novels using the above questions. The more you practice, the better you’ll get at this.

Flow of Reading

Improving your flow of reading will help you understand the vocabulary being used in the shortest time possible. One of the major difficulties in GAMSAT Section 1 is timing: you need to attempt 75 questions in 100 minutes, with 10 minutes of reading time, which works out to roughly 80 seconds per question (see the pacing sketch after the list below). So it is important to develop your flow of reading. Once you’ve had a bit of practice and have developed your thought process, start timing yourself when reading and work on the following:
  • Identifying keywords or the topic of discussion by skimming through content.
  • Speed reading - gathering necessary additional information at a quicker pace. Try using spreeder.com to read articles.
  • Understanding commonly used metaphors and figures of speech.
  • Avoiding omissions - practice summarising every sentence or paragraph in your head.
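To make the timing pressure concrete, here is a minimal Python sketch of the arithmetic (the quarter-time checkpoints are my own illustration, not official exam advice):

```python
# Pacing sketch for GAMSAT Section 1: 75 questions in 100 minutes
# (the 10 minutes of reading time is extra and not counted here).
TOTAL_MINUTES = 100
NUM_QUESTIONS = 75

seconds_per_question = TOTAL_MINUTES * 60 / NUM_QUESTIONS
print(f"Target pace: {seconds_per_question:.0f} seconds per question")  # 80 seconds

# Quarter-time checkpoints: the question you should roughly have reached.
for fraction in (0.25, 0.50, 0.75, 1.00):
    minute = TOTAL_MINUTES * fraction
    question = round(NUM_QUESTIONS * fraction)
    print(f"By minute {minute:.0f}, aim to be finishing question {question}")
```

If you practice under timed conditions, comparing where you actually are against these checkpoints quickly shows whether your flow of reading is fast enough.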
With enough practice, you can improve both elements. Take the time to develop the GAMSAT Section 1 mindset and you will see a huge difference in your overall score.

Good luck!

Mike.

Thursday, December 22, 2016

Busting the ‘Neuromyths’ About How We Learn

Dominant learning style of target audience (Wikipedia)
by Duncan Geere, Gothenburg-based freelance science, technology and culture journalist and editor at http://www.howwegettonext.com, on How We Get to Next: https://howwegettonext.com/busting-the-neuromyths-about-how-we-learn-df4d0cee3e56#.hhhcbx5eh
 
Some people learn best by doing, right? Others have a visual memory, and it’s important for them to see something depicted if they want to remember it. Then there are those who learn most effectively through reading and writing, and another group takes on new ideas best if they hear them.
 
This idea of different “learning styles” is widely accepted among a huge proportion of the public. But there’s one major problem - there’s no evidence that it’s true. 
 
In a 2008 paper, four psychologists reviewed every study ever conducted on learning styles, dating all the way back to the 1920s. They found loads of evidence that both kids and adults will, if asked, express a preference about how they like information to be presented to them. They also found that some people are definitively better than others at processing different kinds of information.
 
What they didn’t find was any evidence that those two truths interact - that an instructional method that proves effective for students with one learning preference is not equally effective for students with a different one. While there have been a lot of studies on learning styles, only a handful were designed to adequately test their validity in the classroom. Of those that were, several contradicted this accepted wisdom about how we learn best.
 
It’s important to note that a lack of evidence for something is not the same as actively disproving it. But it should also be said that disproving learning styles would require a far, far higher base of evidence, and may even be impossible. 
 
“At present, there is no adequate evidence base to justify incorporating learning-styles assessments into general educational practice,” the psychologists wrote in 2008. “Limited education resources would better be devoted to adopting other educational practices that have a strong evidence base, of which there are an increasing number.” 
 
Learning styles isn’t the only flawed belief we have about the ways we learn. In fact, there’s a whole laundry list of “neuromyths,” and some are more insidious than others.
 
Let’s start with the persistent myth of humans only using 10% of our brains. You can probably guess that this one just isn’t true - it’s unlikely that evolution and/or God would have provided us with a bodily organ that’s 90% useless. In reality, we use almost every part of the brain over a 24-hour period, though small groups of neurons go through constant cycles of being dormant and active. “Evidence would show over a day you use 100% of the brain,” John Henley, a neurologist at the Mayo Clinic, told Scientific American in 2008.
 
Then there’s the one about people being “left-brained” or “right-brained” based on their personality and cognitive style. Left-brained people, it’s believed, have a more logical and methodological approach, while right-brained people are more creative and intuitive.
 
But in 2013 a team of neuroscientists investigated whether this assessment had merit by looking at MRI brain images from more than a thousand volunteers between the ages of 7 and 29. They found that while certain networks of neurons tended to be stronger on either the left or right hemisphere of individual brains, that side preference didn’t hold true for the entire brain. “Our data are not consistent with a whole-brain phenotype of greater ‘left-brained’ or greater ‘right-brained’ network strength across individuals,” they concluded. 
 
These types of myths might be easy to debunk, but other fallacies are more deeply ingrained in our education systems and harder to root out. Take early childhood education, for example. 
 
The human brain grows so much in a child’s first five years that it would seem obvious that preschool programs would have a huge effect on cognitive development. Except that meta-studies show that by the age of 8 it’s almost impossible to tell which children had preschool education and which didn’t. So while your little darling might seem to be enjoying those early days in the classroom, it has no detectable long-term effect on his or her gray matter.
 
Similarly, much has been said about the importance of play in child development. But in 2013 a group of psychologists reviewed 40 years of studies before writing: “Our take-away message is that existing evidence does not support strong causal claims about the unique importance of pretend play for development, and that much more and better research is essential for clarifying its possible role.” It’s entirely plausible that play is merely one of many routes to development, or perhaps a secondary effect of those development strategies.
 
The concept of “digital natives” is pretty questionable, too - the idea that kids who have grown up with the web have somehow developed the ability to do many different things at the same time in a way that their parents can’t. In actuality, studies show that today’s university students use a very limited range of largely established services (Google, Facebook, Wikipedia, etc.) for both learning and socializing, and don’t have an especially deep knowledge of technology. As for multitasking, kids have become practiced at it, sure, but they still suffer the exact same cognitive setbacks that non-“digital natives” do when trying to do several things at once.
 
“There is overwhelming evidence that [digital natives] do not exist,” wrote psychologists Paul A. Kirschner and Jeroen J.G. van Merriënboer in a study of urban legends in education in 2013. “They are not capable of doing that with modern technologies which is ascribed to their repertoire,” they said, and “they actually may ‘suffer’ if teaching and education tries to play into these so-called abilities to relate to, work with, and control their own learning in multimedia and digitally pervasive environments.”
 
When it comes to the classroom, perhaps the most sinister practice of all is medicating students who don’t perform well. In a 2015 review titled “What Doesn’t Work in Education: The Politics of Distraction” (full disclosure: the report was published by Pearson, a partner sponsor of this month’s stories), John Hattie wrote: “There has been a major increase in the number of children who come to school each day pre-labelled. In my own state, Victoria, the incidence of autism and Asperger’s has increased 340% in the past three years.”
 
He continued: “Although diagnostic tests may have improved, it is hard to believe that these major increases in incidence are real. One potential reason for the increase might be parents’ (and teachers’) desire to seek an explanation for ‘unusual’ behaviours and the medical and pharmaceutical professions’ ready provision of answers (and drugs). Another potential reason for the spike might be the extra funding that is tied to students who are labelled as autistic.”
 
Hattie was very clear not to claim that ADHD and autism aren’t real; they are, he said. “Instead, I believe that the massive increase in the frequency of these labels points to a potential cultural problem: students are being diagnosed and labelled primarily for financial and accountability reasons rather than for the enactment of appropriate educational interventions.” 
 
These educational myths are not an insignificant problem - they affect teachers just as much as they do the general public. In a 2012 study, 242 teachers in the United Kingdom and the Netherlands believed an average of nearly half of the “neuromyths” gathered by the researchers, especially those linked to commercialized education programs: the California nonprofit Brain Gym, for example, promotes physical exercises that it claims improve children’s ability to learn, a claim based entirely on pseudoscience, while the VARK program promotes learning styles.
 
“These myths persist because they spread easily, offer alluring explanations, and simple, practical solutions,” said Harry Fletcher-Wood, an education researcher at the Institute for Teaching in London. “They spread easily because they are relatively simple - albeit dressed to impress in pseudoscientific explanations.”
 
In 2014, Stanford’s Jack Schneider wrote a book aiming to help scientists spread evidence-based strategies in education called From the Ivory Tower to the Schoolhouse: How Scholarship Becomes Common Knowledge in Education. In it, Schneider lists four factors that any idea must have if teachers are going to notice, accept, use, and share it. It’s clear, however, that these same factors are just as good at spreading pseudoscience.
 
The first factor asks if the idea is relevant to something teachers experience, and whether there appears to be evidence to back it up. Most educational myths that persist deal with situations teachers come across a lot, and they’re based around enough neuroscience that they sound plausible to someone who hasn’t studied them in depth.
 
The second factor, acceptance, means the idea presented must be compatible with the inner values of teachers. Many educators like to believe that they can find creative methods for teaching their students even inside the rigid, one-size-fits-all system they grew up with - so the more ideas sound like they can be personalized to a student, the more likely they are to be looked upon favorably.
 
The third, usage, looks at how easily an idea can be implemented in the classroom. It’s fairly simple to create a lesson that takes learning styles, the left brain/right brain myth, or the importance of play into account, for example. And while it’s harder for an individual teacher to spread the idea that preschool education is vital or that kids who aren’t performing well in the classroom may have mental health issues, these concepts take hold at a higher level among those setting education and health policy.
 
The fourth and final factor relies on how spreadable the idea is. Does it require years of training to learn it, or can it be picked up in a half-hour conference session? Ideas that fit the latter description are much more likely to go viral for obvious reasons - they’re easy to communicate.
 
“What we are dealing with here is a very popular and very persistent pseudoscience, which jeopardizes both the quality of education and the credibility of the educational sciences,” said Kirschner and van Merriënboer. “There is the risk that we are entering a downward spiral: The popularity of urban legends paints the educational sciences as a mumbo-jumbo science, which in turn makes it increasingly difficult to move valuable innovations in the field of education into practice.”
 
Fletcher-Wood added: “[These myths] offer alluring models which seem to explain much of what we see. And they offer simple solutions: Kids aren’t concentrating - give them a tablet! [Suddenly] they’re digital natives!” Unfortunately, these falsehoods will “remain remarkably stubborn,” he said, “because people tend to discount new information which contradicts their existing beliefs.”
 
Some researchers are more optimistic than others about whether it’s ultimately possible to chase out these misconceptions for good. Kirschner and van Merriënboer are not hopeful. “The step from legend-based education based on pseudoscience to evidence-based education based on science demands a quantum leap,” they wrote. “Rather than a quick change in research methodologies or objects of study, it requires a fundamental change in scientific attitude.”
 
But in a 2012 article in the journal Frontiers in Psychology, Sanne Dekker, Nikki Lee, Paul Howard-Jones and Jelle Jolles describe how work is already beginning to establish effective methods for chasing out these misbeliefs.
 
“Such intervention studies should be performed according to the principles and approach of evidence-based or evidence-informed practice. This could yield valuable information for the prevention of myths in the future and for the development of valid educational innovations,” they said.
 
Fletcher-Wood picks out what some solutions could look like in the education system. “The first is raising the general level of research literacy,” he says. “Helping people to spot the difference between a randomized, controlled trial and opinion, based on a handful of surveys. This may sound obvious, but pseudo-experts and the media can both be guilty of promoting work as ‘research’ which does not meet basic quality guidelines.”
 
The second is the “meme-ification” of research, an idea that will no doubt strike fear into the hearts of teachers around the world. “The Learning Scientists’ blogs and posters are an interesting way of trying to share complicated but true research findings in an easy and accessible way,” says Fletcher-Wood. 
 
“We can’t expect everyone to spend their evenings reading peer-reviewed papers; we can present genuine research more conveniently. This brings its own problems - research as meme wars - but it makes us no worse off than we were previously. The other solution is to ensure that those who’ve read around neuromyths combat these ideas humbly but persistently.”
 
Ultimately, the research shows that teachers are interested in learning about the brain and its role in learning. That’s encouraging, wrote Dekker and her colleagues in their 2012 editorial, adding: “Although the integration of neuroscience in educational practice remains challenging, joint efforts of scientists and practitioners may pave the way toward a successful collaboration between the two fields.”

George Orwell’s Six Rules for Writing Clear and Tight Prose

Image via Creative Commons
by Josh Jones, Open Culture: http://www.openculture.com/2016/05/george-orwells-six-rules-for-writing-clear-and-tight-prose.html

Most everyone who knows the work of George Orwell knows his 1946 essay “Politics and the English Language,” in which he rails against careless, confusing, and unclear prose.

“Our civilization is decadent,” he argues, “and our language … must inevitably share in the general collapse.” The examples Orwell quotes are all guilty in various ways of “staleness of imagery” and “lack of precision.”

Ultimately, Orwell claims, bad writing results from corrupt thinking, and often attempts to make palatable corrupt acts: “Political speech and writing are largely the defense of the indefensible.” His examples of colonialism, forced deportations, and bombing campaigns find ready analogues in our own time. Pay attention to how the next article, interview, or book you read uses language “favorable to political conformity” to soften terrible things.

Orwell’s analysis identifies several culprits that obscure meaning and lead to whole paragraphs of bombastic, empty prose: 

Dying metaphors: essentially clichés, which “have lost all evocative power and are merely used because they save people the trouble of inventing phrases for themselves.” 

Operators or verbal false limbs: these are the wordy, awkward constructions in place of a single, simple word. Some examples he gives include “exhibit a tendency to,” “serve the purpose of,” “play a leading part in,” “have the effect of” (one particular peeve of mine when I taught English composition was the phrase “due to the fact that” for the far simpler “because”).

Pretentious diction: Orwell identifies a number of words he says “are used to dress up a simple statement and give an air of scientific impartiality to biased judgments.” He also includes in this category “jargon peculiar to Marxist writing” (“petty bourgeois,” “lackey,” “flunkey,” “hyena”). 

Meaningless words: Abstractions, such as “romantic,” “plastic,” “values,” “human,” “sentimental,” etc. used “in the sense that they not only do not point to any discoverable object, but are hardly ever expected to do so by the reader.” Orwell also damns such political buzzwords as “democracy,” “socialism,” “freedom,” “patriotic,” “justice,” and “fascism,” since they each have “several different meanings which cannot be reconciled with one another.”

Most readers of Orwell’s essay inevitably point out that Orwell himself has committed some of the faults he finds in others, but will also, with some introspection, find those same faults in their own writing. Anyone who writes in an institutional context - be it academia, journalism, or the corporate world - acquires all sorts of bad habits that must be broken with deliberate intent.

“The process” of learning bad writing habits “is reversible” Orwell promises, “if one is willing to take the necessary trouble.” How should we proceed? These are the rules Orwell suggests:
(i) Never use a metaphor, simile, or other figure of speech which you are used to seeing in print.
(ii) Never use a long word where a short one will do.
(iii) If it is possible to cut a word out, always cut it out.
(iv) Never use the passive where you can use the active.
(v) Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.
(vi) Break any of these rules sooner than say anything outright barbarous.
What constitutes “outright barbarous” wording he does not say, exactly. As the internet cliché has it: Your Mileage May Vary. You may find creative ways to break these rules without thereby being obscure or justifying mass murder.

But Orwell does preface his guidelines with some very sound advice: “Probably it is better to put off using words as long as possible and get one’s meaning as clear as one can through pictures and sensations. Afterward one can choose - not simply accept - the phrases that will best cover the meaning.” Not only does this practice get us closer to using clear, specific, concrete language, but it results in writing that grounds our readers in the sensory world we all share to some degree, rather than the airy world of abstract thought and belief that we don’t.

These “elementary” rules do not cover “the literary use of language,” writes Orwell, “but merely language as an instrument for expressing and not for concealing or preventing thought.” In the seventy years since his essay, the quality of English prose has likely not improved, but our ready access to writing guides of all kinds has. Those who care about clarity of thought and responsible use of rhetoric would do well to consult them often, and to read, or re-read, Orwell’s essay.

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Tuesday, October 4, 2016

Here's What You Need to Know About Starting University With Dyslexia

Simulation of dyslexic vision (Photo credit: Wikipedia)
by Harriet Cameron, University of Sheffield, The Conversation: https://theconversation.com/heres-what-you-need-to-know-about-starting-university-with-dyslexia-50035

Going to university can be a test for anyone, fresh or not-so-fresh from school. Students are not only expected to adapt to independent study and increased reading loads, but they also have to learn as soon as possible how to “do” the kind of academic writing and academic talk their given field demands. And for students with dyslexia, this can be particularly challenging.

Dyslexic students are normally no different to their non-dyslexic peers in their understanding of their academic subject, but dyslexia can make things like reading course books, writing essays and remembering lecture points harder to do. And there can also be difficulties for dyslexic students in getting their words and ideas across in seminars and tutorials.

These things are hard partly because of specific cognitive difficulties with processing particular kinds of information, and partly because of the way schools and universities tend to structure and assess learning - through non-interactive lectures and timed, written examination. And because there is a lot of disagreement about what dyslexia actually means in terms of cognitive function, it can also be difficult to agree on what to do about it, in practice.


Grade driven learning

In today’s society, being academically literate is particularly valued - with the most successful learner often seen as the one who gets the highest grades. High grades are often thought to go hand-in-hand with hard work, meaning lower grades are often taken to imply a lack of effort and a lack of academic ability - the twin evils of “laziness” and “stupidity”.

But part of the challenge for dyslexia and learning isn’t so much that dyslexic people can’t keep up with complex ideas, it’s more that they may need to approach tasks in a different way to get the learning to make sense, and to “stick”.

So when a student with dyslexia finds their learning preferences don’t fit so well with the learning environments on offer, they will often use additional study aids - such as speech-to-text software, mind-mapping applications and “read and write text help” - in addition to attending regular tutorials with a specialist teacher to work on their academic literacy.


Having dyslexia can make learning difficult at university. Antonio Guillem/Shutterstock
But sometimes dyslexic students (and their peers) feel that using additional study help gives them an unfair leg-up. This means that although dyslexic students have a right under the law to make use of things - like extra time in exams and specialist tuition - doing so can be a threat to their sense of self-worth and academic identity.

In other words, they can feel like they are not really “intelligent” if they can’t do the work without making use of adjustments. This can lead dyslexic students to play down their difficulties, and to refuse help. And students with dyslexia will sometimes try to go it alone, so to speak, to work hard and “just deal with it” - even though they will be disadvantaged by this approach. This can leave dyslexic students in a lose-lose situation.

Peer support?

Working out when to access support at university is further complicated by the uncertainty of how the students and staff they come across will react to a disclosure of dyslexia. Media representations of dyslexia have tended to be rather sensationalist, and often follow the “dyslexia as a myth” line without care for the details of the studies which they refer to.

Attitudes towards dyslexia among academic staff can also vary, and peers can react in unexpected ways - saying things like “that’s ridiculous, why do you get a printer just because you’re dyslexic?”.

Dyslexic students have to be ever-ready to explain what dyslexia means and how it affects them to whomever needs to know. They may need to declare it to their personal tutor one day, to an exams invigilator another, and to their housemate the next. And in each case they need to guess how their declaration will be received - which can be exhausting.

Dyslexic students may also find themselves stuck between contradictory ideas about who they are as a dyslexic, and what they should be doing about it. And in this sense they internalise the apparent “common-sense view” that they are solely responsible for the difficulties they experience.

Rethinking dyslexia

So, to dyslexic students who have just begun their university education, it is time for you to rethink the concept of disability - because it is not a dirty word. The disabling aspects of dyslexia are not inside you, but rather they are part of a particular educational set-up and learning environment.


 Don’t let dyslexia hold you back. Syda Productions/Shutterstock

To tackle this, work out which situations at university put you at a disadvantage compared to other students, and make use of any adjustments you need to help you. It’s not an unfair leg-up, it’s simply a small step towards levelling the playing field.

You should also make use of specialist dyslexia tutors: they are not only there to help you develop academic skills and confidence, but more importantly they can also help you critically reflect upon what dyslexia means for you and your learning.

And finally, remember you are not to blame for some of the difficulties you may experience in university learning, so be kind to yourself. These difficulties are nothing to do with how worthy you are, or how “clever” you are - and you belong at university just as much as anyone else does.

Harriet Cameron, Academic Director: specific learning difficulties in higher education, University of Sheffield

This article was originally published on The Conversation. Read the original article.

Tuesday, September 27, 2016

BOOK REVIEW: Seven Steps to a Comprehensive Literature Review: A Multimodal and Cultural Approach by Anthony J. Onwuegbuzie and Rebecca Frels

7 Steps to a Literature Review cover
by Impact of Social Sciences: http://blogs.lse.ac.uk/impactofsocialsciences/2016/09/25/book-review-seven-steps-to-a-comprehensive-literature-review-a-multimodal-and-cultural-approach-by-anthony-j-onwuegbuzie-and-rebecca-frels/

In Seven Steps to a Comprehensive Literature Review: A Multimodal and Cultural Approach, Anthony J. Onwuegbuzie and Rebecca Frels offer a new guide on how to produce a comprehensive literature review through seven key steps that incorporate rigour, validity and reliability.

Ana Raquel Nunes recommends this helpful, well-informed and well-organised book to those undertaking literature reviews as well as those reflecting on research methodologies more broadly.
This review originally appeared on LSE Review of Books.

Seven Steps to a Comprehensive Literature Review: A Multimodal and Cultural Approach, by Anthony J. Onwuegbuzie and Rebecca Frels, offers a straightforward guide on how to conduct literature reviews, and is the successor to Onwuegbuzie’s numerous previous works on qualitative, quantitative and mixed methods research. 

The book is a source of in-depth understanding of the role that literature reviews play within the research process and its practices, and is a substantive contribution to social, behavioural and health sciences research. It aims to incorporate rigour, validity and reliability into the conduct of literature reviews, and presents seven steps for achieving this.

According to the authors, literature reviews should be systematic, defined ‘as a set of rigorous routines, documentation of such routines, and the way the literature reviewer negotiates particular biases throughout these routines’ (10). The authors acknowledge that this definition differs from the definitions of systematic literature reviews used in the health sciences. 

Instead, this book defines a comprehensive literature review (CLR) as an integrative review, being the combination of narrative review (i.e. theoretical, historical, general and methodological reviews) and systematic review (i.e. meta-analysis, meta-summary, rapid review and meta-synthesis). 

Seven Steps to a Comprehensive Literature Review purposefully addresses CLR as ‘a methodological, culturally progressive approach involving the practice of documenting the process of inquiry into the current state of knowledge about a selected topic’ (18). Additionally, the authors’ approach to the CLR takes into account the researcher’s philosophical stance, research methods and practices which, when combined, create a framework for collecting, analysing and evaluating the information that will form the basis for conducting a literature review. 

The book thus presents five types of information - MODES: namely, Media; Observation(s); Documents; Expert(s); and Secondary Sources - that help the researcher in their journey through the literature review landscape, which in the end will produce either a separate output or inform primary research within a bigger research project.

Seven Steps to a Comprehensive Literature Review is an effective tool for an iterative process denoting a structured and chronological approach to conducting literature reviews. The book covers a range of research topics and practical examples arising from the authors’ own research including education, counselling and health systems research. Through these, the authors report an in-depth model characterised by a series of qualitative, quantitative and mixed research approaches, methods and techniques used to collect, analyse and evaluate data/information for the creation of new knowledge.

As its title suggests, the book is organised around seven sequential steps within three phases: the Exploration Phase includes Steps 1-5 (Exploring Beliefs and Topics; Initiating the Search; Storing and Organising Information; Selecting/Deselecting Information; and Expanding the Search (MODES)); the Interpretation Phase includes Step 6 (Analysing and Synthesising Information); and the Communication Phase includes Step 7 (Presenting the CLR Report). 

As the argument of the book develops, the differences between traditional literature reviews and the CLR become evident as the seven steps are unveiled. Traditional literature reviews are encapsulated within Steps 1-4, whilst a CLR goes further through the addition of Steps 5-7.

One of the steps that was of particular interest to me was Step 6 on analysing and synthesising information. The book advances research methodology knowledge and practice on the different elements of empirical data and how both qualitative and quantitative information can be analysed and synthesised to inform a CLR.

In Step 6, the authors go to great lengths to explain and exemplify how users can perform qualitative and quantitative data analyses of information, as well as the level of integration that can be achieved when doing mixed methods analyses. 

Additionally, the authors explore the nature of data analysis and identify three levels or layers that need to be taken into consideration: namely, the research approach (e.g. grounded theory); the research method (e.g. measures of regression); and the research technique (e.g. content analysis) used. This is found to be essential as data analysis is considered to be a product of the research method used, which in turn is linked to the research approach.

Seven Steps to a Comprehensive Literature Review is not merely intended for those conducting a literature review, but it also works as a research methodology book as it addresses an extensive number of research methodologies, methods and techniques. The book offers a theoretically and practically informed discussion of increased integration of research processes, practices and products, raising important quality standards assurances necessary for a CLR, but also for research more generally. 

This is a very well-organised book which cleverly and effectively uses tables, figures and boxes throughout to illustrate and help contextualise detailed examples of the different steps involved in conducting a literature review.

Accordingly, readers seeking a tool or a guide on conducting literature reviews will find this a very helpful book. It will also be of use to a broader readership interested in research methodology more generally as it encompasses the different research traditions (qualitative, quantitative and mixed methods) as well as the stages of the research process (the research problem, the literature review, research design, data collection, data analysis and interpretation and report writing). 

For the reasons above, it will appeal widely to students, academics and practitioners interested in conducting literature reviews within the social, behavioural and health sciences. It is suitable for different levels of experience in conducting literature reviews and doing research in general. Furthermore, this is a book that should be kept at hand and used as a guide each time one decides to conduct a piece of research that includes a literature review, as it will provide new ideas and directions depending on the topic and disciplinary perspective.

Dr Ana Raquel Nunes is a Research Fellow in the Division of Health Sciences at the Warwick Medical School, University of Warwick, and a Research Methodologist and Adviser for the National Institute for Health Research (NIHR) Research Design Service (RDS). She is an interdisciplinary and mixed methods researcher working at the interface between public health, environmental science and social science. Her active interests include human vulnerability, resilience and adaptation to stresses and threats (e.g. climate change), housing and health, and fuel poverty. You can find more about her research here. 

Note: This review gives the views of the author, and not the position of the LSE Review of Books blog, or of the London School of Economics.

The Flipped Classroom Unplugged: Three Tech-Free Strategies for Engaging Students

by Barbi Honeycutt, Faculty Focus: http://www.facultyfocus.com/articles/blended-flipped-learning/flipped-classroom-unplugged-three-tech-free-strategies-engaging-students/

Throughout this summer article series, we’ve addressed some of the most frequently asked questions about the flipped classroom in higher education.

We’ve shared ideas for student motivation, student engagement, time management, student resistance, and large classes. Since this is the final article in the series, I reviewed my notes and the findings from the Faculty Focus reader survey on flipped classroom trends (2015), and there’s one more topic we need to address:  creativity. 

“I don’t know if I’m creative enough to flip my class. How do you keep coming up with new teaching strategies and tools to engage students during class time?”

In almost every workshop I teach, at least one participant asks me this question. And, the findings from the Faculty Focus reader survey highlight the scope of this concern among educators. Almost 79% of the survey respondents indicated that “being creative and developing new strategies and ideas” was sometimes, often, or always a challenge when implementing the flipped classroom model.

By design, the flipped classroom model challenges you to plan activities and learning experiences where students focus on applying, analyzing, and evaluating course content during class time. It does take a certain amount of creativity to flip your classroom, but it doesn’t have to be intimidating. You can flip your class using simple strategies that allow for students to interact with the material and engage with each other.

For example, lately, I’ve been exploring the idea of flipping moments in our classes without using technology. What would happen if we got back to the basics with some of our activities and used everyday tools to engage students in higher levels of thinking? Would this help some of us overcome some of these feelings of intimidation and inspire us to be more creative? To start the conversation and get the creative ideas flowing, here are three “unplugged” flipped strategies you can add to your class to engage students. 

Flipped Strategy: Adaptation of Muddiest Point
Tool: Index Cards

“Muddiest Point” is a classroom assessment technique that allows students the opportunity to tell you what they are still confused or unclear about from the lesson (Angelo and Cross, 1993). Ask students to write their “muddiest point” on an index card. You may want to specifically focus their attention on the material from today’s lecture, yesterday’s lab, last night’s homework, or any other learning experience you want them to examine.

After your students complete the task, divide them into groups and tell them to analyze the cards based on some set of criteria. Ask them to look for patterns, common themes, categories, or outliers. Note how this adaptation of the Muddiest Point activity challenges students to move beyond just explaining what they don’t understand and into the higher levels of Bloom’s Taxonomy. They are now summarizing, sorting, analyzing, and evaluating the cards while looking for connections and themes.

Bonus idea: After students sort the cards, challenge them to find the answers together. If you want to keep things “unplugged,” tell them they can only use their textbook, hand-written notes, or other printed materials. 

Flipped Strategy: Mind Mapping
Tools: Sticky Notes, Whiteboard, Markers

Give each pair or group of students a stack of sticky notes and ask them to go to the whiteboard or chalkboard. Assign a topic related to the course material and challenge students to create a mind map of the topic using only their sticky notes. Explain that they can only put one idea on each sticky note, but they can use as many sticky notes as they need. Encourage them to use markers or chalk to draw lines and make connections between the ideas/concepts so you can see how their mind map is organized. By using sticky notes, it’ll be easier for the students to change their maps based on new ways of thinking.

Bonus idea: If you assign all groups the same topic, then you can ask them to rotate around the room and compare and contrast the different mind maps. You could give each group a different colored sticky note so they can add to another group’s mind map, almost like a gallery walk but with sticky notes. 

Flipped Strategy: Brainstorming Challenge
Tools: Pair of Dice, Worksheet


Give students a case study, question, or problem that benefits from brainstorming. Then, divide students into groups and give each group a pair of six-sided dice. Tell students to roll the dice, and whatever number they roll represents the number of answers they need to generate.

For example, if they roll a four and a five, they need to brainstorm nine possible solutions. If they roll a pair of sixes, they need to brainstorm 12 possible solutions. Give them a worksheet to record their ideas. Once groups have completed their challenge, ask them to switch their worksheets with another group and review their lists. This could be the beginning of a class discussion, or you could go another round and see how many more ideas students can add to another group’s list.
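If you want to prepare the targets before class, here is a minimal Python sketch of the dice mechanic described above (the number of groups is arbitrary):

```python
import random

def roll_brainstorm_target() -> int:
    """Roll two six-sided dice; the total (2-12) is the number of
    solutions the group must brainstorm."""
    return random.randint(1, 6) + random.randint(1, 6)

# Example: pre-generate targets for five groups.
for group in range(1, 6):
    print(f"Group {group}: brainstorm {roll_brainstorm_target()} solutions")
```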

Bonus idea: At the end of this activity, ask students to review all of the ideas, select the top two best solutions, and justify their decision.

Hopefully these unplugged flipped strategies will inspire you to be creative in your own way. Your flipped classroom may not look like your colleague’s flipped classroom, and that’s okay. It’s not a “one-size-fits-all” approach. There isn’t one “right” way to flip your class. The most important takeaway is to use the tools and strategies that make the flipped model work for you and your students.

Thank you for following the series this summer. I hope I have addressed many of your questions about the flipped model, and I look forward to hearing from you! 

Now it’s your turn! What “unplugged” flipped strategies have you used in your classes to enhance student engagement? 

Resources

Angelo, T. & Cross, P. (1993). Classroom Assessment Techniques: A Handbook for College Teachers. 2nd edition. Jossey-Bass.

Honeycutt, B. (July 7, 2016). Three ways you can use index cards to FLIP your class: Another “unplugged” flipped strategy. Published on LinkedIn. Available online: https://www.linkedin.com/pulse/3-ways-you-can-use-index-cards-flip-your-class-barbi-honeycutt-ph-d-?trk=mp-author-card 

Barbi Honeycutt is the owner of FLIP It Consulting in Raleigh, N.C. and an adjunct assistant professor at NC State University. Her new book is 101 Unplugged Flipped Strategies to Engage Your Students. Connect on Twitter @BarbiHoneycutt and on her blog.

Tuesday, September 20, 2016

The Neoliberal Assault on Academia: The Neoliberal Sacking of the Universities Runs Much Deeper than Tuition Hikes and Budget Cuts

Students are increasingly unwilling to take on massive debt for jobs they have little confidence of getting [EPA]
by Tarak Barkawi, Al Jazeera: http://www.aljazeera.com/indepth/opinion/2013/04/20134238284530760.html

Tarak Barkawi is Reader in the Department of International Relations, London School of Economics

The New York Times, Slate and Al Jazeera have recently drawn attention to the adjunctification of the professoriate in the US. Only 24% of university and college faculty are now tenured or tenure-track.

Much of the coverage has focused on the sub-poverty wages of adjunct faculty, their lack of job security and the growing legions of unemployed and under-employed PhDs. Elsewhere, the focus has been on web-based learning and the massive open online courses (MOOCs), with some commentators celebrating and others lamenting their arrival. 

The two developments are not unrelated. Harvard recently asked its alumni to volunteer their time as "online mentors" and "discussion group managers" for an online course. Fewer professors and fewer qualified - or even paid - teaching assistants will be required in higher education's New Order.

Lost amid the fetishisation of information technology and the pathos of the struggle over proper working conditions for adjunct faculty is the deeper crisis of the academic profession occasioned by neoliberalism. This crisis is connected to the economics of higher education but it is not primarily about that. The neoliberal sacking of the universities runs much deeper than tuition fee hikes and budget cuts.

Thatcherite budget-cutting exercise  

The professions are in part defined by the fact that they are self-governing and self-regulating. For many years now, the professoriate has not only been ceding power to a neoliberal managerial class, but has in many cases been actively collaborating with it.

As a dose of shock capitalism, the 2008 financial crisis accelerated processes already well underway. In successive waves, the crisis has hit each pillar of the American university system. The initial stock market crash blasted the endowments of the prestige private universities. Before long, neoliberal ideologues and their disastrous austerity policies undermined state and eventually federal funding for universities and their research.

Tuition soared and students turned even more to debt financing. Now that bubble is bursting and hitting all the institutions of higher education that depend on tuition. Students are increasingly unwilling to take on massive debt for jobs they have little confidence of getting.

The upshot is to soften the resistance of faculty to change, in part by making people fear for their jobs but mostly by creating a generalised sense of crisis. It becomes all the easier for some academic "leaders" to be drawn up into the recurrent task of "reinventing" the university.

Here is the intersection with neoliberal management culture. Neoliberal managers thrive not by bringing in new resources - since austerity is always the order of the day - but by constantly rearranging the deck chairs. Each manager seeks to reorganise and restructure in order to leave his or her mark. They depart for the next lucrative job before the ship goes under.

One consequence is the mania for mergers of departments and faculties in the US and the UK. In both the university and corporate worlds, mergers are not only demoralising for staff, but they also break up solidarities, destroy traditions and make staff much more amenable to control from above. Such projects have little to do with academic excellence or even academic purposes, and are often self-defeating, as the managers and the quislings among the professoriate who assist them have little idea what they are doing.

One of the only things the University of Birmingham was ever known for in the wider world was its Centre for Contemporary Cultural Studies. In 2002, the Centre was shut down by fiat in an act of vandalism described as "restructuring". The justification given for this was yet another neoliberal exercise then known as the Research Assessment Exercise, or RAE. 

In US terms, post-tenure review is an imperfect analogy for the salutary and depressing tale of the RAE. Invented by Margaret Thatcher's government, the basic idea is to rank all the departments in any one discipline and channel funding to the "best" departments, while cutting funding to the rest. The RAE was an assault on the basic idea of a university - the universe of knowledge - since universities would lose poor performing departments.

In neoliberal speak, this may sound very sensible. But imagine what happens to, say, physics and biology students, when, as the University of Exeter did, the chemistry department is shut down. Who will teach them chemistry? More to the point, how do you judge which is "best"? For this, the RAE needed the willing and active collaboration of the professoriate.

When I first held a UK academic post in the relatively early days of the RAE in the late 1990s, academics talked about it as if it were just some form they had to fill out, an annoying bureaucratic exercise that would not really affect us. Others, academic "leaders", saw it as an opportunity to do down their colleagues in other universities and channel funds to their own departments. 

Neoliberal assault on the universities  

In this way, the professors themselves helped to administer and legitimate a Thatcherite budget-cutting exercise. Worse, they participated in what they know to be a fiction: that you can rank scholarly research like you can restaurants or hotels so as to determine which departments have the "best" faculty.

Little more than a decade later - and now known as the Research Excellence Framework (REF) - this five-yearly exercise completely dominates UK academic life. It determines hiring patterns, career progression, and status and duties within departments. It organises the research projects of individual scholars so as to meet arbitrary deadlines. It has created space for a whole class of paid consultants who rank scholarship and assist in putting together REF returns.

UK academics regularly talk about each other's work in terms of whether this or that book or article is "three star" or "four star". Again, for those attuned to neoliberal ways of thinking, this may appear natural. But remember that the entire point of university research is conversation and contestation over what is true and right. In the natural sciences, as in the social sciences and humanities, one person's truth is another person's tosh, and valid knowledge emerges from the clash of many different perspectives.

Somehow, UK professors have become intimately bound up in administering and legitimating a government-run exercise that now shapes more of university life than they themselves do. They have actively ceded their power. US faculty need to keep this travesty in mind.

Something as apparently innocuous as an accreditation agency demanding that syllabi be written in a particular format, or majors justified in a particular way, can wind up empowering university management to intimately regulate teaching. A meaningless buzzword in the mouth of a dean, such as "new majority student", might in practice help legitimate the hiring of less qualified faculty. After all, if "teacher ownership of content" is old fashioned, why do you need to hire a professor who can create his or her own course?

The bottom line of the neoliberal assault on the universities is the increasing power of management and the undermining of faculty self-governance. The real story behind MOOCs may be the ways in which they assist management restructuring efforts of core university practices, under the smiley-faced banner of "open access" and assisted in some cases by their "superstar", camera-ready professors.

Meanwhile, all those adjunct faculty are far more subject to managerial control and regulation than are tenured professors. Aside from their low cost, that is one of the principal reasons why they are so attractive to university managers. 

Tarak Barkawi is Associate Professor in the Department of Politics, New School for Social Research. 

Source: Al Jazeera.

Wednesday, September 7, 2016

Two-Thirds of College Students Think They’re Going to Change the World

College Students (Photo credit: Wikipedia)
by Lisa Wade, PhD, Cross-posted at PolicyMic, Huffington Post, BlogHer, and Pacific Standard, Sociological Images: https://thesocietypages.org/socimages/2013/05/20/college-students-aspirations-and-expectations/

Writer Peg Streep is writing a book about the Millennial generation and she routinely sprinkles great data into her posts at Psychology Today.

Recently she linked to a study by Net Impact that surveyed currently enrolled college students and college graduates across three generations: Millennials, Gen Xers, and Baby Boomers. The questions focused on life goals and work priorities. They found significant differences between students and college grads, as well as interesting generational differences.

First, students generally have higher demands on the world; they are as likely or more likely than workers to say that a wide range of accomplishments are “important or essential to [their] happiness”.

In particular, students are more likely than workers to say it is important or essential to have a prestigious career with which they can make an impact. More than a third think that this will happen within the next five years.

Wealth is less important to students than prestige and impact. Over a third say they would take a significant pay cut to work for a company committed to corporate social responsibility (CSR), almost half would do so for a company that makes a positive social or environmental impact, and over half to align their values with their job.

Students stand out, then, in both the desire to be personally successful and to make a positive contribution to society.



At the same time, they’re cynical about other people’s priorities. Students and Millennials are far more likely than Gen Xers or Boomers to think that “people are just looking out for themselves.”



This data rings true to this college professor. Despite the recession, the students at my (rather elite, private, liberal arts) school surprise me with their high professional expectations (thinking that they should be wildly successful, even if they’re worried they won’t be) and their desire to change the world (many strongly identify as progressives who are concerned with social inequalities and political corruption).

Some call this entitlement, but I think it’s at least as true to say that today’s college youth (the self-esteem generation) have been promised these things. They’ve always been told to dream big, and so they do.

Unfortunately, I’m afraid that we’ve sold our young people a bill of goods. Their high expectations sound like a recipe for disappointment, even for my privileged population, especially if they expect it to happen before they exit their twenties!

Alternatively, what we’re seeing is the idealism of youth. It will be interesting to see if they downshift their expectations once they get into the workforce. Net Impact doesn’t address whether these are largely generational or age differences. It’s probably a combination of both. 

Lisa Wade, PhD is a professor at Occidental College. She is the author of American Hookup, a book about college sexual culture, and a textbook about gender. You can follow her on Twitter, Facebook, and Instagram.

Voicing Writing: Exploring the Link Between Body and Text

Image via open.abc.net.au
by Susan Carter, Doctoral Writing: https://doctoralwriting.wordpress.com/2016/09/07/voicing-writing-exploring-the-link-between-body-and-text/

During my own doctorate, I was troubled by voice and identity. As an undergraduate, I aspired to sounding like an academic; at doctoral level, it felt important to sound like myself. This post picks over some of the purposes of having doctoral students read their work aloud.

Most of us who support doctoral students with writing will repeat this handy bit of revision advice: ‘read the sentence out loud and you’ll hear when it is too long, or when the syntax is a bit skewy.’ It is the case that the process of voicing written prose will bring to light what’s going wrong in a way that helps revision.

Another ‘talking cure’ (to bounce off early psychoanalysis terminology) entails students talking through their research while someone writes down what they are hearing and asks questions when they don’t understand. Fairly commonly, students just can’t capture what is important in their research in their writing, but can find it when they are talking to another human who probes them. I think this is because authors expect that readers will see what’s important without actual sentences spelling it out, whereas readers often don’t.

We can advise, ‘remember your audience’s needs’, but with the talking cure, the audience (think ‘reader’) has become real. Their needs are real. The researcher is no longer groping round in thickets of big words but is back helping another human to grab hold of the significance of their work.

Again, quite commonly we all as writers find it hard to actually spell out a clear articulation of what our research means or why it matters. Being able to do so in simple language is hugely empowering - if we can help doctoral students to do it, we make their survival as researchers a lot more likely.

Usually, then, as a supervisor I’ll ask students to read their writing aloud for the reasons of enabling authorial clarity and to foster thinking by asking questions when I suspect that there is more to be said. But in this post, I also want to speculate more on the relationship between voice and identity.

The term ‘voice’ is used for the sense of authorial individuality captured in written prose. It is often hard to achieve, because on one hand it must demonstrate that the writer is aware of genre and discipline expectations, and on the other, that the writer has engaged with any contentions in their field and has positioned themselves defensibly in relation to them.

Claire Aitchison has posted on using voice recognition software that writes what you say so that you capture an embodied and voiced version of your thoughts. Claire finds that she likes the spoken-aloud version of her own writing better than the one produced by her fingers on the keyboard. There’s something going on with that.

I’m speculating that this is due to her sounding more like herself, like the Claire who talks in all kinds of situations, and in quite different roles with a range of people (observing many different genres). Maybe talking aloud serves another purpose: staying more true to the self that you are holistically, both in and out of academia.

Are others attracted by the possibility of developing a holistic voice that captures an author as they would be recognized by their friends and family outside of academia? Those people we live our lives with don’t hear us talking in abstract theory.

I want to suggest that talking also lets you hear when you are using theory in a way that is true or untrue to the ordinary talk of your background. For some of us, this alignment factor feels important, and/or it may be important because we are writing from a theoretical position, as a woman using feminist theory, say, or an indigenous author using post-colonial theory for the purpose of ‘decolonising’ (Smith 2012).

When I wrote my PhD, I was mature, with life and work behind me that gave me a self who was known by friends, neighbours, previous workmates and family. I wanted to become an academic, but I didn’t want to sound pompous. Pomposity can seem a real risk in doctoral writing. OK, there is nothing ‘natural’ about written text, so the idea of an authentic voice is naive, but the textual construct of an academic persona, I felt, should bear some recognition of the embodied writer.

In my case, I couldn’t chase after the feminist theory that attracted me to the extent that it wasn’t true to who I was, in this case, happily married to a bloke. I can’t remember the sentence, but I do remember reading one of my sentences aloud and recognizing I just could not use it. It was a well-written, theoretically-interesting academic sentence that took some ideas I believed in to their logical conclusion, but I would feel an idiot reading it to some of my mates. My own life as I had lived it wasn’t predicated on theory.

The sentence had to go, and I had to find a way to be sharp in academic terms, but within the scope of who I was as a whole person. This tangle with theory and voice induced one of those mini identity crises that accompany learning that is transformative.

And I think that doctoral students who are in the process of transition but have not yet found an academic voice often struggle to pull their ordinary-world self into alignment with their academic voice. Perhaps that is what feels uncomfortable.

So I’m suggesting here another use for asking doctoral students to read their writing aloud. It can be empowering for doctoral writers who want to build an authorial voice that speaks their holistic self into being within academia.

Does your experience tell you that doctoral students commonly grapple with a comfortable good-fit academic voice?