Friday, January 31, 2014

Is It All Writing?

by Rachael Cayley, Explorations of Style:

Today I’d like to write about a topic that I find perplexing: what is the best way to define the term ‘writing’?

Should we use writing as an omnibus term for every aspect of creating a text?

Or should we use it more narrowly to refer to the initial act of getting words down on paper?

Undoubtedly, we all do both, depending on context.

Sometimes we think of writing as a soup-to-nuts term for everything from conception to publication, and other times we think of it simply as the moment of composition, distinct from both planning and revising.

While I’m far from consistent in my usage, I know that my tendency is to use the term broadly. Is this just a lack of precision on my part or is there a benefit to being inclusive in the way we define writing?

When I hear myself offering a broad definition of writing, I’m often reminded of a mama-and-baby yoga class that I attended when my first child was born.

This class was full of babies nursing, babies getting changed, babies learning to crawl, babies being irresistible, but it wasn't full of anyone doing yoga. And the teacher used to say, as each class finished without any actual yoga having been practised, "It's all yoga!" Which of course it wasn't. It was good and yoga is good, but that didn't make it yoga.

In using a broad category of writing, we may be engaging in a similar sort of self-serving inclusivity. Sorting my sock drawer? Well, I can’t write with cold feet and I can’t find my favourite socks and … it’s all writing!

In a post last year on not-writing, I talked about ways that not-writing can overwhelm our attempts to write. Needless to say, allowing ourselves to define writing too broadly can hamper our productivity.

But is there any benefit to including planning and revising - both obviously essential steps in the creation of a text - in our concept of writing?

To my mind, the benefit of thinking of writing broadly is that doing so may allow us to deepen our commitment to planning and revising.

When we think of writing narrowly, we are naturally treating it as separate from planning and revising. And if that separation works well for you, that’s exactly what you should do.

For some writers, however, treating writing as a category that includes a broader range of activities can be a helpful strategy for dealing with persistent writing difficulties. If we think of planning as a species of writing, we can then use writing as a way of clarifying our own thinking.

When we hold off writing in order to plan what we need to say, some of us will flounder. Being stalled in the pre-writing stage is pretty common in the students that I see; I often see writers who have pages and pages of outlines and sketches, but who don’t feel ‘ready to write’.

I’m not saying that writing is the only solution, but I know that writing generates writing. Starting early may confirm that you are in fact not ready, but it also may generate the text that you need or may lead you to a better understanding of your own topic.

Similarly, if we think of revising as a species of writing, we can then use writing as a tool for extensive revision.

When we think of revision as distinct from writing, we may be less likely to engage in the sort of vigorous revision necessary to move from first to final draft.

That is, when writing is seen more narrowly, revision can be seen as conceptually different from writing, making it more likely to become a limited project of cleaning up mistakes. That limitation shuts off the possibility of using rewriting as a way of radically strengthening a text.

Overall, if we use early writing as our way of figuring out what needs to be said and late writing as our tool for reshaping our text into the most suitable form, we are more likely to break out of the insularity of our own internal thought processes.

The act of writing always anticipates the public. By framing all our writing activities as writing, we may give ourselves greater access to the power of writing to organize and reorganize our thoughts.

Social Sciences Research is Riding High, But is it MOOC-Proofed?

by Patrick Dunleavy, Impact of Social Sciences:

With four fifths of economic value-added found in services, the UK is now primarily a service economy.

This is great news for social science disciplines, which have demonstrated a strong influence in these industries.

Whilst there are glimpses of optimism, argues Patrick Dunleavy, vulnerabilities still remain. 

Given that only one in nine of the 30,000 social science researchers works in a research-only job (compared to one in three in STEM disciplines), the social sciences must respond to the advent of digital disruption in more dynamic ways.

This piece originally appeared on The Conversation and is reposted below under CC BY-ND.

The social sciences can now be seen as a substantial UK industry, worth £23.4bn a year in broad economic terms, according to my research.

But subjects such as politics, economics, business, law and sociology are not being given due recognition for their contribution to the UK’s service economy and labour market. The direct spend on university social science comprises just above a tenth of this total, coming out at £2.7bn a year.

But we need to also look at the indirect economic benefits of social science departments procuring goods and services, and at the multiplier effects of social scientists’ wages on the rest of the economy.

As new research for a book, The Impact of the Social Sciences, shows, these increase the total contribution of university social science to the economy to £4.8bn a year.

And new analysis by the Times Higher Education also shows that social science student numbers, and hence staff numbers, have been growing consistently.

The remaining £19.4bn a year of impact comprises spending by firms, government and public sector agencies, non-governmental organisations and the media, which employ some 380,000 postgraduate-qualified social scientists in professional and analysis occupations.

Because of limitations in the labour force statistics, we can only get a conservative view of how much these other organisations are spending on translating and mediating social science research.

Figure: Translators of social science. Source: Cambridge Econometrics, for LSE Impacts Project. Numbers rounded.

The two biggest groups are nearly 180,000 professionals in government and public services (costing £8.7bn a year); and 170,000 analysts and research translators in finance institutions and the banking industry (costing £9.8bn a year).

Our research also identified 40,000 professionals working in the consultancy industry (costing £1bn a year, half of which goes in helping the public sector).

Previous research on the careers of social science graduates found that 3.5 years after graduation, 84% were in employment, compared to 78% of science graduates.

STEMming the flow

Two basic factors underlie the booming social sciences sector. First, the UK is essentially a services-based economy. Four fifths of economic value-added is now in services.

Social science disciplines connect closely with service industries in many different dimensions, while much of the effort in the UK's science, technology, engineering and mathematics (STEM) disciplines still focuses on a shrinking manufacturing base.

Some STEM research plays a key role in “productising” forms of services, such as websites which allow users to book flights without relying on travel agents, or devices that let people check their blood pressure at home without needing health professionals.

But even here, social science knowledge is key in finding what does or does not work (for example, what types of people can or can’t reliably check their own blood pressure).

Second, we live now on a globalising and intensively investigated planet, where human-dominated systems (such as cities, markets, states, physical and digital networks) increasingly constitute the focus of many concerns.

In studying these systems, social scientists are converging with the most applied STEM disciplines - especially medicine and health sciences, IT and software engineering.

Equally important has been the rising importance of social science in studying “human-influenced systems”, a broad category that now includes virtually all other processes across the globe. With rising human populations, almost everything earth-bound is now human-influenced.

Think, for instance, of how even global climatic systems are responding to fossil fuel emissions, and of how closely any mitigation efforts depend on understanding social, political and economic dynamics.

The old polarisation of social science versus STEM disciplines is withering away fast. This change has accelerated recently as the social sciences in the digital era also incorporate and adapt key STEM science methods - such as analysing big data, using more randomised control trials and experiments, and more systematic review.

Figure: Estimated value of research grants and contracts to UK universities in 2010-11. Source: The Impact of the Social Sciences.

Yet there is still a key potential vulnerability. Government, private sector and charity funding of research is still heavily skewed towards the STEM sciences, which receive four-fifths of research funding, according to our research.

The UK government has been in the grip of dated "techno-nationalist" misconceptions about the sources of economic progress. And the UK private sector often focuses on short-term "bottom-line" factors and on things that give individual firms a comparative advantage.

This is not an area where social science (with its collective-research progress mode) can offer “discovery” breakthroughs or patentable advances. As a result, the social sciences get just over a sixth of the amount of total research funding that goes into STEM.

Generous, secure funding means that over a third of the 67,000 STEM researchers in the UK work in research-only jobs, where they can focus their whole energies on advancing knowledge.

By contrast only one in nine of the 30,000 social science researchers has a research-only job - the huge majority must combine research and teaching.

MOOCs are not the end

Some pessimistic observers have argued that the advent of massive open online courses (MOOCs) could begin to heavily erode the numbers of people involved in university teaching over the next decade.

If this happened, it could hit STEM disciplines hard, where 65% of researchers also teach, but social sciences harder - because 89% of their researchers and departments rely on teaching for their basic incomes.

Yet the significance of MOOCs remains uncertain. Any MOOC effect is likely to be complex: it will probably focus mainly at the sub-university level, produce an upgrading of university starting levels, and actually result in more research-focused undergraduate learning than in the past.

Fundamentally, MOOC doom-merchants are operating with a non-dynamic model of what society needs and gets from education and research.

If we can begin to do simple things more cheaply and more quickly - for instance, draw demand and supply curves, or appreciate the difference between a mean and a mode - we will move on very fast to try and do vastly more complex things, which we hitherto accepted as beyond our control.

That has been the number one lesson of the digital era, and it will continue to be true whatever the scale of MOOCs’ effects. In the contemporary development of human-dominated and human-influenced systems, the social sciences have a secure and increasingly salient role.

LSE Public Policy Group, which Patrick Dunleavy chairs, received funding from the Higher Education Funding Council for England (HEFCE) for the research reported here.

This article was originally published at The Conversation. Read the original article.

Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.

About the Author

Patrick Dunleavy is Co-Director of Democratic Audit, Chair of the LSE Public Policy Group, and a Professor of Political Science at the LSE.

Do Things Ever Change in Education?

by Elizabeth Jamison, Dissertation Gal:

In my continuing research for my dissertation, I came across the first issue of The English Journal (1912) - publication information at the end of this post.

Let me say that first, it’s fascinating to read articles and essays from over a hundred years ago; and second, it’s disturbing to learn that the issues teachers and professors were going through then are exactly the same (with a few minor details changed) today!

Haven’t we learned anything?

By 1895, the subject of educational values was ubiquitous in academia. You might ask why this was important.

Well, before the mass influx of immigrants and the massive increase in America’s workforce, the idea of education was for those who didn’t have to work for a living.

Yet by the turn of the twentieth century, there was a growing need for literate workers, not only in professional fields but in ALL fields. The need to convey thoughts in a concise and clear way was crucial if businesses expected to thrive.

The Dial, a literary journal founded in 1890, recognized that the tide was turning, claiming that “the very fact that educational values [were] being everywhere earnestly discussed [was] itself of the highest significance” (229).

And yet, as late as 1910, despite the ever-increasing demand for a literate workforce, scholars worried about the implications of an educated society. They wondered who would do the “rough work” (American Educational Review XXXI 4).

At this point in time, college was a relatively new concept for the middle classes, and parents would often send their kids to college if they were perceived as being lazy.

By the early twentieth century, however, educators knew that they had to dispel this myth and sell “college” to the masses in order to prepare the next generation for the demands of science and technology: “By turning to the rhetorical texts of Scottish theologian George Campbell during the American Revolution, then to those of the Presbyterian author Hugh Blair and British theologian and political philosopher Richard Whately, educators at universities such as Harvard could instill a frame of reference in students that urged them to see language as a vehicle for action” (Elliot 2).

This brings to mind writing for a purpose, not just for the sake of writing.

Over a hundred years ago, the specific needs of a culture demanded a certain kind of writing - much of it technical and all of it concise and correct - and if we examine civilization throughout the 20th century and into the 2000’s, we will see that shifts in belief systems, work ethic, economics, and technology regularly called for a constantly evolving method of communication, writing, and of course writing assessment.

And yet, our systems of standardized testing, especially on high-stakes tests, have changed very little in the past 65 years. Why is this?

As the teaching of composition became increasingly important in the early twentieth century, the problems that arose grew as well. Teaching writing was difficult in an age of uneducated students who came from a myriad of backgrounds and cultures. Adding to the problems were professors who were overworked, underqualified and underpaid.

In 1912, the very first issue of The English Journal discussed the problems inherent in teaching composition.

In his essay titled “Can Good Composition Teaching Be Done under Present Conditions?” Edwin M. Hopkins, from the University of Kansas, answered his own question and claimed that it could not:

“A single statement will explain the fundamental trouble. Not very many years ago, when effort was made to apply the principle that pupils should learn to write by writing, English composition, previously known as rhetoric, became ostensibly a laboratory subject, but without any material addition to the personnel of its teaching force; there was merely a gratuitous increase in the labor of teachers who were already doing full duty” (xviii).

It’s amazing to note that over a hundred years ago, teachers of composition were lamenting their situation, which was the same then as it is now, and begging for improvements.

One of the primary problems with teaching composition in the early 20th century was the uneven ratio of composition students to teachers.

Because there were so many new students learning how to read and write and so few composition teachers, those teachers were overworked, tired, underpaid, and frustrated.

Hopkins also noted that although the need for composition classes had in fact skyrocketed, the hiring of new teachers to fill that need had not occurred, leaving the already overbooked literature teachers struggling to take on a second job.

To make matters worse, in most instances all other teachers in different departments were well funded and were able to actually go home when the day was done, whereas composition teachers were expected to grade papers 24/7.

The inequities of teaching composition led, of course, to students failing to learn to write adequately, and thus the need for change grew.

It is no wonder that the United States began to search for standardized assessments that wouldn’t kill teachers and that would also - in at least a superficial way - measure students’ writing ability.

Today, students are pressured to achieve so much more than even 20 years ago. I mean, just getting accepted into UGA now is highly competitive, when in the 1990s it was a "state school" and pretty much a sure thing if you were able to follow directions and pass your high school classes.

Now, students have to rank at the top of their class, take multiple AP courses, be leaders in clubs, and contribute to their community through extra service. Wow! No wonder we can see evidence of high blood pressure and anxiety in students now more than ever before. My son’s stressed out and he’s 11. That’s NOT acceptable.

What do you think? Do you have a child in school or are you going through it now? I’d like to hear your thoughts.

Hopkins, Edwin M. “Can Good Composition Teaching Be Done under Present Conditions?” The English Journal 1.1 (1912): 1-8. JSTOR. Web. 30 Jan. 2014.

Thursday, January 30, 2014

Writing the PhD: Motivation? It’s TOTALLY Gone!

by Arjenne Louter, The Dutch PhD Coach:

Do you have that as well? That sometimes your motivation seems ABSOLUTELY gone?

That you don’t have a clue anymore why you started your PhD in the first place? That you would actually prefer that you never started?

Well, you are actually not alone. I honestly don’t know any PhD-student who has not considered quitting. What can you do if you feel like that? One tip that will work for sure.

wanting to quit...

If you don't feel well, if you don't want to go on, if there seems to be no perspective, if you would prefer to stay in bed the whole day under a blanket, or to run away from your computer as fast as possible and slam the door of the lab behind you, never to come back again - if you feel that way, I have a tip for you: write a letter.

Write a letter? Exactly. And to make it even more specific: write a letter to yourself. Describe in the letter why you started your PhD and why you would like to finish it. You can answer the following questions:
- Why did you pursue a PhD?
- What will you do after your PhD?
- What will getting your PhD bring you?
- What will your life look like when you have completed your PhD?
- What options do you have now because you're a PhD-student (which you might not have otherwise)?
- What options and possibilities do you have after your PhD?
- What are the most enjoyable parts of your PhD?
- What inspires you in your PhD?
- How will you feel if you actually finish your PhD?

It might feel a little weird, writing a letter to yourself, but by actually doing it, it often becomes clear again why you do what you do. And please make sure you keep the letter close at hand, so you can re-read it when things are tough. It will work, for sure.

How Can We Prepare University Students for the Real World?

by Nicolette Lee, Victoria University:

New students entering university this year will embark on a path that will require a great deal of emotional and financial investment.

The pay-off they expect is not just the experience or entry-level skills and knowledge, but also the chance of a better future for themselves and their families.

But a recent report by the Australian Council for Educational Research (ACER) shows that our universities are failing to deliver the higher education experience that students are demanding, especially in providing professional capabilities that will equip them for their careers.

Statistics from the 2012 Australasian Survey of Student Engagement (AUSSE) show that only 37% of students in later years believed their experiences had contributed very much to their development of work-related knowledge and skills.

Only 27% felt strongly that their studies have prepared them to work effectively with others. Just 24% felt their studies have contributed very much to their ability to solve complex, real-world problems.

Some results from the 2012 Graduate Course Experience Survey are just as problematic. Nearly half of all bachelor graduates felt university staff seemed more interested in testing what they had memorised than what they had understood.

In other words, the “ramp” we are giving our students may not launch them very far at all when it comes to their employment prospects. We need to do better than teaching students to memorise information if we are to give them the best chance to succeed.

What is the sector doing about it?

Universities are well aware of the work to be done here, with increasing interest in the idea of the capstone learning experience in undergraduate degrees.

While there are probably as many variations of capstones as there are capstones, they are broadly defined as final-year learning experiences that require students to synthesise knowledge and skills gained over the degree.

This leaves a lot of scope to deliver more of the same. Poorly designed capstones attempt to test all the knowledge that has been delivered in the course, or micro-manage tasks to the point that students make no decisions of substance and are trapped in a lock-step process.

Both approaches, of which there are many examples in Australian higher education, simply confirm students’ dependency on us and fail to deliver the experience that students need.

Well-designed capstones, on the other hand, share a common toolbox of learning activities involving relevance, complexity and independence. They require students to deal with complex situations as they would in industry or research.

Students take on a challenge and plan how they are going to tackle it; figure out what they don’t know and what to do about it; set the schedule and work with others; and deal with things that go wrong by fixing them.

Great examples include those such as the graduating project in Arts at Victoria University. Students from across disciplines select a problem or issue, and then define and complete an in-depth project that extends the skills and knowledge that they have learnt throughout the degree.

Some students work with community groups, others work on projects defined by an awareness of an issue and grounded in research. Throughout the process they have to negotiate and make decisions, and take responsibility for their learning as well as the ethical dimensions of working with others.

In education these are not new concepts. They build on the inquiry-based education movement of the past century and on work-based learning traditions stretching back millennia.

These experiences build capabilities that make all the difference: intelligent responses to challenges, creativity, responsibility and resilience.

Capstones: something worth sharing

All of these things, of course, can also happen in any active learning experience at any level. Progressive school education operates on just these principles.

Capstones are different because they build on high-level knowledge and skills acquired over a degree. They can rigorously test whether students can integrate and manage all the skills, knowledge and capabilities they have acquired, and whether they can do so in context.

Great capstones can drive student aspiration and, as a result, retention. An exciting challenge in the final year can be something that students look forward to - and fight to stay for - especially if it gives them a springboard to the ambitions they hold.

Pragmatically (or cynically, depending on your point of view), great capstones also have an almost immediate effect on graduate satisfaction. The most recent experience students have makes a big difference to their perception of whether the degree was worth doing at all.

Capstones demonstrate the point of a degree. They build a bridge between the course and post-graduation. They also provide graduates with concrete examples of what they can achieve independently that can be shown to prospective employers.

The challenge facing universities is to make the most of the opportunity that great capstones provide.

It takes time and money to do it right, but it is less than the cost of having students graduate feeling disappointed and unprepared. If we are doing this at all, we owe it to our students and ourselves to do it well.

Support for the development of this article has been provided by the Australian Government Office for Learning and Teaching through a National Senior Teaching Fellowship. The views expressed in this article do not necessarily reflect the views of the Australian Government Office for Learning and Teaching.

This article was originally published at The Conversation. Read the original article.

Social Scientists Provide Valuable Insight for the Private Sector into How People Live and Interact with Technology

by Jeff Patmore, Impact of Social Sciences:

Ahead of tonight’s panel discussion on Engaged Social Science: Impacts and Use of Research in the UK, panellist Jeff Patmore takes a look back at how social sciences have influenced his work in telecommunications over his years in the private sector. 

The insights from social scientists on how people live their lives and interact with the continually changing landscape of technology allowed him to understand the impact of modern technologies and to help teams to design better technological solutions.

This is a question which has come up a number of times over the past few weeks: first at a meeting at the London School of Economics, then with a colleague at Goldman Sachs, and more recently at a meeting with a Cambridge experimental psychologist in the Department of Engineering.

Conversations with these people caused me to think about my own work with social scientists over the last decade. Had the interactions been critical to my thinking? The answer is yes, a resounding yes, but why?

My first real, and important, experience of working with a social scientist was in 2004 when I was introduced to a young social anthropologist, Xiaoxiao Yan, who had decided that her PhD research would focus on the impact of Broadband communications on the UK and China.

Xiaoxiao had a longstanding interest in the relationship between culture and technology and had been investigating the impact of broadband technology since 2002.

Her supervisor, Alan Macfarlane, suggested that I might be interested in supporting her research and explained that her fieldwork in China - a study of a small start-up company in Beijing - could provide me with valuable insight.

She told me that her research would focus on how the company, in its very early stages, would build an on-line customer base for its product by developing a trusted relationship with its customers through Chinese bulletin boards and IM.

The company sold refurbished second-hand laptop computers and its founders were keen to fulfill an ambition of entrepreneurship - a dream shared by many young Chinese people at that time.


I agreed and she spent 6 months working with and documenting the progress of this start-up in Beijing which, at the time, comprised just three people.

When she returned to the UK she provided me with a video documentary, subtitled in English, which provided a rare insight into this new company and the way they worked with their customers [the full video can be seen here].

Having spent quite some time living and working around the MIT campus I was struck by how similar the young entrepreneurs in Beijing were to those in Cambridge, Mass.

They started their business while still students at Tsinghua University and when it grew too large to run from the campus they moved across the road into a building where they could rent just enough floor space and communications for their needs.

This was very much like the "Cambridge Innovation Centre" opposite the MIT business school at No. 1 Broadway (strapline: More Startups Than Anywhere Else On The Planet).

The study of the Chinese entrepreneurs provided real insight into, and understanding of, the culture of the young Chinese coming out of their best universities, and I have to admit it was something of a wake-up call at the time.

A secondary piece of insight which again surprised me was the speed at which the people in the company could communicate with their customers, using a combination of Instant Messaging and Pinyin, where the IM system quickly converted the typed Pinyin into Chinese characters for the opposite party to read.

I had assumed at the time that using Chinese characters in communication would be slow and laborious, but with these technologies the instant messaging was actually much faster than if they had been using English!

The IM system ‘guessed’ the Chinese characters as the user typed the Pinyin and then provided a number of characters for the user to choose from (see: Wikipedia page on this).
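The guess-and-choose behaviour described above can be sketched in a few lines. This is a purely illustrative toy, not how any real IME works: the syllable-to-character table and function names are invented for this example, and real input methods rank candidates with statistical language models rather than a fixed dictionary.

```python
# Toy sketch of Pinyin candidate selection: the user types romanised
# Pinyin, the system offers matching Chinese characters, and the user
# picks one. The tiny dictionary below is invented for illustration.

CANDIDATES = {
    "ni": ["你", "尼", "泥"],
    "hao": ["好", "号", "浩"],
}

def suggest(pinyin_syllable):
    """Return the candidate characters for one typed Pinyin syllable."""
    return CANDIDATES.get(pinyin_syllable, [])

def compose(syllables, choices):
    """Build a message by picking one candidate per syllable.

    `choices` holds the index the user selects from each candidate list.
    """
    return "".join(suggest(s)[c] for s, c in zip(syllables, choices))

# Typing "ni hao" and picking the first candidate each time gives 你好.
print(compose(["ni", "hao"], [0, 0]))
```

Because each short syllable maps to a whole character, a few keystrokes can produce a complete word, which is why the typists Xiaoxiao observed could outpace English.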

Between 2006 and 2008 I worked with a remarkable young social scientist based at Victoria University, Melbourne, Australia: Natasha Dwyer. She was studying for her PhD on trust in digital environments. While working with me she produced two great reports:
  1. Dwyer, N (2008), Research Report: Towards Trust-Enabling Technology, Traces of Trust. Her opening paragraph:
    Human communication consists of a wide range of verbal and nonverbal cues. Nonverbal cues can communicate intent, meaning and subtleties and have an intrinsic value. While digital technologies have dramatically enhanced the amount of opportunities of individuals to interact over space and time, these technologies do not contain this rich world of data people receive in the ‘physical world’ to help them make their decision. Thus when people make a trust assessment in the digital space they are not privy to the usual contextual cues used to interpret information.
  2. Dwyer, N (2006), Research Report: Trust and the Young Digital User: The Significance of the Trustee’s Intention, Motivation and the Gift Exchange Process. In this paper she makes these key points:
    As the ‘early-adopters of technology’, young people (16 -25 year olds) are defining how technology is understood in our society. New ways of gathering, critiquing and retaining information are being established - the ‘web generation’, the first generation to grow up digital, take on ‘new technology’ - these value systems regarding digital communication affect how the participants perceive trust, relationships and exchanges in the digital domain.
Her examination of trust and privacy in the environments enabled by new technologies was quite groundbreaking and allowed me to really understand the nature of trust in our ‘always on’ new world and how this new generation of people who have ‘grown up digital’ perceive our world.

In 2008, after noting how quickly Facebook uptake was increasing, I arranged for a social anthropologist at Cambridge, Kathleen Richardson, to lead on a short research project looking at how Facebook might encourage people to be more sociable and to act as a form of personal biography.

Additionally she examined the nature of ‘friends’ on social networks and how these differ from conventional friendships.

She worked with a British Telecom social anthropologist, Sue Hessey, and together they produced a paper, “Archiving the self? Facebook as a biography of social and relational memory”, which was published in the Journal of Information, Communication and Ethics in Society.

A key finding was that social networking sites can bolster past and weaker tie relationships as well as strengthen stronger tie ones.

They also found that there appeared to be ‘rules’ developing around Facebook use, for instance, people had to have met a Facebook friend at least once physically before they were accepted as a friend on the social network.

Another interesting finding was that participants rarely interact with the majority of their Facebook friends and “it is this dormant archive of relationships that hold the most interest as it provides an archive of relationships that would have dissipated without these technologies”.
Again social scientists had provided real insight into how people interact with new technologies.

Since 2008 I have worked closely with many social scientists across a multitude of disciplines and their knowledge and insight has allowed me both to understand the impact of modern technologies and to help teams to design better technological solutions for people.

Recently two global research initiatives have demonstrated the strategic value of this type of work.
  1. July 2011: "Culture, Communication and Change: An investigation of the use and impact of modern media and technology in our lives". The research project, led from the UK, was carried out in collaboration with teams in the US, China and Australia and it provided a unique view of how people are using communications technology in their lives at home and at work, whether young or old. It was covered in more than 200 media articles globally including slots on BBC Breakfast and BBC News at 6.
  2. November 2013: "Conversations, Conferencing and Collaboration: An International Investigation of Factors Influencing the Effectiveness of Distributed Meetings". Modern conferencing technology makes it easy to have meetings with internationally distributed teams at any time, anywhere. These meetings can reduce both travel time and cost; however, it is a matter of some debate whether they can be as effective as meetings held face-to-face. The report examines the phenomenon in teams from the UK, US, Australia and China and produces recommendations on how these meetings can be made more effective.
Social scientists are able to provide a unique and extremely valuable insight into how people live their lives and interact with the continually changing landscape of technology.

They can also show us how our human relationships are changed by technologies, for example our friendships on social networks.

But they can also provide us with insight into bigger global phenomena such as how trust and privacy are perceived, something which in this digital age is of great importance to everyone.

This piece originally appeared on Jeff Patmore’s personal blog and is reposted with permission.

Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.

About the Author

Jeff Patmore was previously head of BT’s Strategic University Research & Collaboration programme, directing programmes of work with some of the world’s top academic institutions and collaborating with some of the best technology and business minds in academia. 

He has over 30 years industrial experience in information and communication technologies and currently his principal fields of interest are the management of innovative multi-disciplinary teams, knowledge sharing and open innovation. 

He is a Fellow of the RSA, and has previously been a Director of Young Engineers for Britain and a contributor to the Cambridge-MIT Institute. Following his retirement from BT in 2011 he has maintained his links with the University through membership of Pembroke College and he works closely with the Engineering Design Centre in the Department of Engineering.

Wednesday, January 29, 2014

(Federal Education) Minister Pyne Fails Another Test

Jackeroo Peter Barry shoes his horse (Photo credit: State Library of Victoria Collections)
by Ian Keese, Online Opinion:

On January 20 Minister Pyne was given 750 words in the Fairfax Media to justify his Government’s review of the Australian Curriculum and I was looking forward to being provided with some good educational reasons for carrying it out.

I certainly agreed with the heading ‘Politics have no place in a curriculum review’ and his comment that ‘Partisan politics is at its worst when dressed up as public concern’.

Unfortunately most of the article was partisan politics at its worst, without any reference to the many positive aspects of the process undertaken by ACARA under the leadership of Professor Barry McGaw.

The process, which was initiated by the Howard Government, did move to tight deadlines and it was brought to completion in a remarkably short time considering all the stakeholders involved.

However the process was never ‘rushed, ad hoc, stop go’ as Mr Pyne claimed. I challenge Mr Pyne to provide evidence of this. At all stages the process involved wide consultation with a variety of experts, classroom teachers and the public, all of whom represented a broad cross section of political positions.

Partisan politics was also obvious in Mr Pyne’s appointment of two people who have been happy to criticise from the sidelines and who represent the views of a vocal minority.

Of course their views should be considered, among those of others far closer to students, with far more academic standing in education and representing other points of view.

The claim that his Government has a ‘mandate’ also demonstrates a very poor understanding of the democratic process.

While in a ‘democratic dictatorship’ it is winner takes all, there are many reasons why this is not true in a genuine representative democracy. In this particular case the Coalition Government, under a leader to whom the electorate never really warmed, essentially won by default as Labor self-destructed.

Secondly, no elector agrees with every policy proposed by a potential government. After an election we always have to take the good with the bad, but this does not take away our right to be critics.

Finally, any ‘mandate’ has to be won day by day, and the evidence of opinion polls so far is that the Coalition Government has fairly quickly lost whatever mandate it had. It is likely that the initial breaking of promises on the Gonski Review, followed by the wide variety of positions held on school funding within a fortnight, has played a significant part in this.

We still do not know exactly how the financing of the Gonski Review is to be worked out, but Trevor Cobbold has argued ‘the government is using states' rights as a pretence to guarantee funding increases for private schools but not public schools.’

Of course the curriculum is far from perfect, and I have been a vocal critic of some aspects of the history curriculum, particularly at the secondary school level.

A few topics on widely spaced historical periods of a few Asian nations have been scattered through the curriculum, with minimal compulsion to teach any of it, but the worst part has been the failure to successfully integrate modern Asian history into a world perspective.

As well, 15% (nearly two thirds of a year in a four-year course) is devoted to World War I and World War II, while only half this time is given to the crucial years of Australian history from European settlement to the early years of the Federal system.

But despite these criticisms I am excited that, through the history curriculum, students all over the country will have the opportunity to gain a shared knowledge of the contemporary world and of how, over thousands of years, the world they live in came into being.

Any educator will find particular aspects of the curriculum to criticise, but in a democracy any curriculum has to be a compromise.

Teachers have spent hundreds of hours, much in their own time, preparing teaching programs and resources; publishers have invested in textbooks and IT resources; conferences have been held over all Australia working out how teachers might implement the curriculum in the best interests of their own students.

Plenty remains to be done, including ensuring there are enough qualified teachers to teach it. The best thing for Minister Pyne would be to focus on the real issues and on directing resources to supporting teachers, rather than undermining all that has been done so far.

Let us see how the Australian Curriculum works in practice; then we will be ready for a thorough, widely consultative review.

‘Value for Money’ Rhetoric in Higher Education Undermines the Value of Knowledge in Society

Historical Depiction of University of Bologna (Wikipedia)

Over the past 15 years, reiterated across successive governments, the concept of value for money has been internalised throughout the higher education sector.

Joanna Williams outlines the reasons why it is problematic to use student choice and value for money as a means of holding universities to account.

Universities should be concerned with knowledge not skills; and intellectual capital not economic capital. Seeing the university as a financial investment in employability skills undermines the authority and value of knowledge.

That fee-paying students should seek value for money in their purchase of a degree has become accepted as common sense. 

Young people are encouraged to perform complex calculations whereby they subtract tuition fees paid and income lost while studying from potential future lifetime earnings, in order to determine whether or not their investment in higher education will be worthwhile. 

Not only do such crude economic calculations make no sense at all, but worse, they encourage potential students to see a degree as a financial investment. 

This degrades higher education. Immersing yourself in your chosen discipline, critiquing existing knowledge and advancing new knowledge are all missing from economic calculations. 

Even just enjoying being a student, making new friends, trying out new ideas, engaging with politics and culture, are left out when we become fixated on putting a price on everything.

Almost fifteen years ago, Sir Ron Dearing in his report of the National Committee of Inquiry into Higher Education, argued that government and universities must ‘encourage the student to see him/ herself as an investor in receipt of a service, and to seek, as an investor, value for money and a good return from the investment’ (1997, 22: 9). 

This is now echoed in every university in the country as courses are marketed on the basis of the employability skills students can expect to acquire and statistics on post-graduation employment and earnings.

Such claims are misleading. They are based on the premise that because graduates earn, on average, higher wages than non-graduates there is something inherent in ‘graduateness’ that leads to increased profitability for employers. 

The links between education, productivity and profitability are presented as causal. However, a correlation between having a degree and earning more money does not necessarily mean that one causes the other. 

Instead, it may be the case that education is used as a legitimate means of ranking individuals according to the employment vacancies available, and, as Professor Alison Wolf notes, as more people attain higher level qualifications, or the number of job opportunities decreases, employers merely raise the entry threshold (see Wolf, 2002). 

Investing in higher education, as the Swedish academic Mats Alvesson suggests, is a ‘zero-sum game’ and a matter of positionality.

The financial returns on a degree vary enormously by individual, institution and subject. For some students university may indeed lead directly into a well-paid career, but for others it may not. 

The logic of individuals seeking to maximise financial returns is that all students would choose to study Economics at a top-ranked university. But if every student did this the effect of wage differentials would immediately be negated. 

As the economist Fred Hirsch put it back in 1976, if you’re in a crowd and you want to get a better view then standing on tiptoes is a rational thing to do. At an individual level it works. However, if everyone then stands on tiptoes the effect is negated. 

A degree as an investment in future earnings might be rational at an individual level but it does not logically follow that the same relationship between a degree and earnings holds true at the aggregate level. The wage differentials paid to individuals having a degree depend more upon supply and demand, social prestige and cultural cachet than on any particular skills gained.

The zero-sum qualifications game has a number of consequences. Pierre Bourdieu argued that academic devaluation increases the significance of an individual’s social and cultural capital. A student’s social class background thus becomes more relevant to their future career prospects, not less. 

In a similar way, certain degree courses such as Law and Economics, and degrees obtained from particular institutions such as Oxbridge are more attractive ‘commodities’ - a fact some students will be aware of through school and family connections. 

Promoting education, a positional good, as one with inherent value also leads to qualification escalation. More undergraduates feel under pressure to gain an MA (and with it more debt) in order to distinguish themselves in the labour market.

Instrumentalism is anathema to education

The focus upon gaining a qualification that can be traded in the post-graduation labour market shifts attention away from what is really valuable about education. 

From the rational student’s perspective, engaging in intellectual struggle makes no sense at all if your aim is to secure a certificate in as risk-free and time-efficient way as possible. Exploring interesting and challenging ideas is a distraction to be avoided. 

Most university lecturers will have been confronted by students asking the dreaded question: ‘Will this be in the exam?’ Students adopting such an instrumental approach are least likely to immerse themselves in learning and exploring new knowledge. When students engage with new knowledge they cannot know what their ‘learning outcomes’ might be.

Because learning in higher education cannot be pre-determined, it is problematic to use student choice and value for money as a means of holding universities to account. The assumption behind successive government policies seems to be that student choice will drive up quality in higher education. 

In Students at the Heart of the System [pdf] it is claimed: ‘Better informed students will take their custom to the places offering good value for money. In this way, teaching will be placed back at the heart of every student’s university experience’. 

This assumes students can know - in advance of taking up a university place - what excellent teaching looks like. It assumes a direct relationship between the quality of teaching and the quality of learning without any acknowledgement of the efforts students make themselves. 

Furthermore it is assumed that surveys such as the National Student Survey, which form a key component of many league tables, provide a measure of teaching quality and do not just act as a proxy for student satisfaction.

Value for money is not a measure of quality

For some students, value for money may just mean getting what they want - satisfaction in the short term and a high level qualification - for minimal effort. The role of universities should be to challenge this assumption. 

But the notion that educational quality can be driven upwards by a market based on perceived value for money is more likely to lead to a race to the bottom in terms of educational standards as branding, reputation management and the perception of quality all become more important than confronting students with intellectual challenge. 

Demands for accountability will further erode the academic autonomy of universities, leading lecturers to teach a predetermined curriculum in a way that demonstrates ‘value added’ most effectively. 

Educationally, this may lead to a focus on learnt facts or the demonstration of a narrow range of skills. Qualitative measures of educational development or intellectual engagement are all more difficult to measure effectively.

Universities should be concerned with knowledge not skills; and intellectual capital not economic capital.

For students, university should be about mastering disciplinary knowledge - not unquestioningly, but because knowing what has been thought and said by scholars who have studied a subject before you leaves you better placed to interpret, critique and ultimately add to society’s collective understanding of the world we live in.

As a society we can only properly value universities when we value knowledge. Seeing university as a financial investment in employability skills undermines the authority of subject knowledge and risks jettisoning the hard-won intellectual gains of previous generations of scholars.

Joanna Williams is one of four speakers at this evening’s panel discussion at the Strategic Society Centre on Value for Money in Higher Education: What is it and how can it be achieved? Twitter hashtag: #vfmHE


About the Author

Joanna Williams is a Senior Lecturer in Higher Education and Academic Practice at the University of Kent. Her book Consuming Higher Education: Why Learning Can’t Be Bought was published by Bloomsbury in 2013. 

Joanna is particularly interested in the impact of marketisation upon universities, knowledge and the public good in higher education, and academic freedom. She is the education editor of the online current affairs magazine Spiked.

Tuesday, January 28, 2014

BOOK REVIEW: The Art of Sociological Argument: Review by @AcademicDiary

The authors reviewed include obvious choices like the big three ‘founding fathers’ - Karl Marx, Emile Durkheim and Max Weber - although Crow’s discussion of them is far from obvious.
For example, I was surprised to realize how short their sociological lives were - Marx having greatest longevity at 64 years with Durkheim next at 59 years and Max Weber just 56 years.
Crow leads us through the ways in which they used metaphor, like Weber’s idea of bureaucracy as an ‘iron cage,’ or personification, as in Marx’s unforgettable ‘Mr Moneybags,’ to make their arguments.

Next Crow gives us a trio of American sociological writers - Talcott Parsons, Charles Wright Mills and Erving Goffman.  It’s no secret that I have a weakness for the writing of C. Wright Mills but reading this book I found myself having much more sympathy for Talcott Parsons as a person.

Parsons comes across as patient and even tempered, while Mills seems bombastic and imprecise by comparison. Yet at the same time, Mills is more lasting and alluring to his readership.

Goffman is presented as a sociological humourist with a brilliant eye for more analytical metaphors. However, the purpose of a metaphor for Goffman is to support an argument like scaffolding: “Scaffolds … are to build other things with, and should be erected with an eye to taking them down”.

Erving Goffman did more than any other sociologist to give us a way of understanding society’s back stage, while at the same time being very secretive about his own personal life.

The last part of the book features a chapter on Michel Foucault and another on Ann Oakley. Crow talks very thoughtfully about Foucault’s use of shock tactics and a kind of gothic style in his writing.

Foucault’s rhetoric insists on leaving things open, refusing to claim the final word on any given issue. For Foucault, those who claim knowledge pilfer the voices of their subjects and, in the context of discussions of crime, this “shuts the prisoner up (in both senses)”.

Ann Oakley is the only female sociological writer to be included. The chapter dedicated to Oakley’s writing was this reader’s favourite. What Crow does so successfully is to re-enchant books that you think you know already.

It was a real surprise and revelation to be introduced to the range of Oakley’s writing from her classic The Sociology of Housework to policy reports, memoir, fiction and poetry.

The diversity of Oakley’s work is astonishing. She writes: “All writing is an invitation to the imagination … a matter of new arrangements of words, and thus of new forms.”

In a way Oakley’s work is a provocation to find new ways of writing sociologically. Crow quotes Oakley from one of her eighties poetry collections: “who would want a history of articles / typed and dissected, lost and uncredited.” By implication Oakley is challenging us to ask: will the books and articles we’ve written all too speedily for the audit culture inevitably have a short shelf life?

The Art of Sociological Argument is a wonderful and beautifully written book. It has cost me a small fortune in impulse purchases from Amazon.

Reading Crow makes me want to go back to the classics, from Riesman’s The Lonely Crowd to Oakley’s Gender on Planet Earth. The book reads like an argument that has been rehearsed and honed through teaching the work of these great sociologists.

Crow’s conclusion is that there are ways we can improve how we express our arguments. He offers ten points on how to write sociology more artfully, which I have paraphrased as follows:
  1. Care for your readers - invite your readers into a conversation with your problem, rather than preach to them by being overly didactic.
  2. Challenge your reader’s presuppositions and surprise them, even if this means being shocking.
  3. Don’t be afraid to use humour and irony to amuse and persuade.
  4. Work with what is counter-intuitive and perplexing and it will open up new insights.
  5. Metaphors and analogies can help get beyond descriptions of phenomena that are readily perceived.
  6. Formulate imaginative questions that invite interesting sociological answers.
  7. Foster a capacity for self-criticism.
  8. Seek to persuade and do not assume that readers will share your agenda or understanding.
  9. Avoid claiming too much in an argument but also be aware of the risks of claiming too little and not explicating its potential.
  10. Literary style is no substitute for content but a good argument is all the better for being well presented.
Fifty years from now such a book will need to be written very differently. It has made me reflect on the transformation of Goldsmiths Sociology during my twenty years here, from a department with fewer than a handful of female colleagues to one where the majority of Goldsmiths sociologists are women. This year our department celebrates its half century.

Sociology has no future without feminist writers and the male domination of the discipline, as represented in the writers reviewed in this book, simply cannot and should not last. That’s not to mention the ubiquitous whiteness of the authors included in this book.

With this in mind it is interesting to think and perhaps hope for what the sociological pantheon might look like, and how different the discipline will be, when Goldsmiths Sociology celebrates its centenary.

Les Back has been teaching in the Sociology Department at Goldsmiths, University of London since 1993. His main areas of academic interest are the sociology of racism, multiculture, popular culture, music and sound studies and city life.

Curriculum Review: Western Civilisation’s Legacy Has a Dark Side

Western civilisation, history often ignored (Wikimedia Commons)
by Riaz Hassan, Flinders University

The push is currently on for Australia’s national curriculum to place more emphasis on the history of Western civilisation and its values.

But if we accept that the purpose of such an education is to achieve a proper and fuller appreciation of this legacy and its role in the making of the modern world (and Australia), we cannot ignore the many significant elements of its dark side.

Core values

It is commonplace to hear that Judeo-Christian values are the core of Western civilisation. But, ironically, destroying Jewish religious idols was key to historical anti-Semitism in Christian European societies. Jews were accused of various kinds of conspiracies and evil designs.

In recent years, this part of Jewish-Christian history is glossed over in favour of claims that anti-Semitism is unique to Islamic societies. But according to eminent historian Bernard Lewis, Jewish and Muslim strands of theology are far closer to each other than either is to Christianity.

Jews lived under Islamic rule for 1400 years and in many lands. While they were never free from discrimination, Jews were rarely subjected to the persecutions and violent massacres they were in Christian societies.


Colonialism

Colonialism was a brutal historical event. It was unleashed in the 18th century by a number of European countries to reorganise the world for capitalist exploitation and political and cultural domination.

Colonialism was a disgraceful robbery of land and resources from large segments of the human population to satisfy the colonising imperial powers' incessant greed. As the African saying goes:
When white man came we had the land, they had the Bible. Now we have the Bible and they have the land.
The incorporation of colonies into economies of metropolitan countries required the destruction of indigenous economies and subjugation of the natives through military coercion and destruction of local cultures.

Never before have so many societies and cultures been subjected to such oppressive racist exploitation and humiliation.


Colonialism began in the context of the largest human movement in history, involving millions of Europeans, Chinese and Indians migrating to new lands in Africa, the Americas and Australasia.

New countries were established on lands taken forcibly from their indigenous inhabitants who were systematically displaced or destroyed.

Within a century and a half of British occupation, Australia’s indigenous population declined by 80% due to diseases, repression and violence. In the US, the native population declined by 95% within two centuries of European occupation. The same type of destruction was repeated in African and South American colonies.

Global inequalities

The destiny of nations is shaped by demography and geography. Colonialism led to huge distortions in favour of the English-speaking world.

Consider the countries that trace their lineage to Britain: United States, Canada, Australia and New Zealand. Together they have 5.5% of the world’s population and 19% of earth’s land mass. In comparison, China and India have 38% of the world’s population and 8% of earth’s land mass.

These disparities have worked to the great economic advantage of these countries, giving them exclusive access to huge mineral, agricultural and other vital resources of the land they occupy. This has primarily been for the benefit of their comparatively small populations, exacerbating global economic inequalities.

Exclusion of natives and migrants

The inequalities were created through insidious claims that new colonies were “white man’s countries”, which excluded the “non-whites” and indigenous population from citizenship.

The indigenous people and “non-whites” could only work as indentured labourers, servants and slaves. They had no legal rights to permanent residency and ownership of land and economic resources.

Laws were enacted to deny the natives such rights and to expel non-white workers, or to exclude them from entering the colonies.

Democracy and exclusion

The new British colonies prided themselves on sharing the democratic politics and institutions of English-speaking cultures. They endowed upon themselves distinctive capacities for self-government and democracy, which could only survive in the absence of distinctions of class and colour.

Such ideological prerequisites for democracy made conditions of racial homogeneity imperative and led to large-scale ethnic cleansing.

The notion of the people so crucial to democratic rule was defined in ethnic terms. The establishment of European colonies made previously less diverse societies more diverse.

In such conditions, politically and militarily dominant groups conflated ethnic identity with national identity, conferring upon themselves all the benefits and rights to the exclusion and subjugation, often through violent means, of rivals. Restrictions on immigration sought to forge racial homogeneity.

Whites-only immigration policies have gradually been either abolished or radically changed in these countries, but immigration still remains one of the most emotionally charged political issues in them.

The Abbott government’s obsession with Australia’s border protection, demonising and criminalising asylum seekers and refugees, is one such contemporary illustration.

Ironically, it is happening in a country whose very foundation was laid by illegal immigration and the brutal dispossession of its indigenous population.


Militarism

Militarism is another legacy of the colonial expansion and its successor states. In 2012, the military spending of the United States, Australia, Canada, the UK and New Zealand amounted to US$794 billion or 46% of total global military spending, with the US accounting for 40%.

By comparison, China and India accounted for 9.5% and 2.6% respectively of global military spending.

Military spending will increase in the coming years given the emerging superpower rivalry between China and the US. This will make the world less safe, as well as consuming resources which could more usefully be employed to address global and national inequalities.

Riaz Hassan receives funding from ARC.
The Conversation

This article was originally published at The Conversation. Read the original article.

What Do Research Developers Do?

Astroboy cake (Photo by Tseen Khoo; cake by Shayne Smail)
from The Research Whisperer:

Isn’t it brilliant when you learn something within a week of the new year?

When one of my academic buddies asked me in late 2013 what research developers are meant to do, I happily said, “Let me write a blogpost on that!”, and rubbed my hands with glee at the gift of an easy post to knock off in the new year.

I sat down to write this post, and was immediately bogged down in pondering the specificities and individualisation of the role.

I realised that it wasn’t as straightforward as I’d thought. Let me explain:

When I started this job three years ago (thank you, LinkedIn, for the congratulations), I was one of three research developers who were stepping into new positions. My Research Whisperer buddy @jod999 is another from this cohort.

We each had responsibility for one of our institution’s colleges (similar to faculties). There was no-one there before us, and no standing expectations to fulfil. There were expectations, of course, and these are otherwise known as our job descriptions.

As it turns out, though, each of us has cultivated different processes and priorities when carrying out our basic job of helping researchers find money to do their research.

What I’m talking about in this post is how I approach what I do, and the types of things I think of as part of my job even though they may not be stated explicitly on my job description. Below are two of the main threads of my job.

1. My job description says: Help researchers find funding for their research, particularly to diversify the range of schemes they might consider.

What this actually means I do: Get up to speed on as many relevant funding schemes as possible, and target clusters or individuals about these opportunities through specific emails. When it's a more widely relevant scheme, I break out the School mailing lists and put together regular funding newsletters. How red-hot the match is between funding schemes and researchers often determines how avidly I flag it.

What this requires: It took longer than I realised to gain a useful, nuanced understanding of who is working on what in the Schools I take care of. There's the realm of university marketing about research, and there's the realm of everyday nascent projects that still need love and resources. There are the high-flyers who tend to score top-level national grants, and those who have yet to secure their first competitive grant. Accommodating the different stages of projects and people's track-records is a constant, necessary learning process. More often than not, it's a diplomatic coaching exercise.

2. My job description says: Assist in the development and review of grant applications.

What this actually means I do: I discuss and review potential grant applications with the researchers. They send me drafts of their application, or we meet for chats about the scheme and their project/budget before they write it up. I review and return their drafts, usually tracking any changes and adding comments. Depending on the scale of the application and the punctuality of the researchers, I might do this more than once.

I sometimes also nut out the budgets for researchers after we've had a chat about what is needed. As I've said before, if you conquer the budget, you conquer the project planning.

I manage the review process, which means setting schedules for submission in line with internal and external (funding body) deadlines. For some schemes, such as the Australian Research Council's, we have an intensive, almost year-long process that involves many people across Schools. Smaller schemes may require just one review from me.

I also create and streamline grant development resources (e.g. factsheets, scheme summaries, funding calendars, application strategies) to make the development and review of grants as smooth as possible. The Research Whisperer is one such resource.

What this requires: This aspect can be very organic. There are researchers who need minimal ushering through grant applications, and those at the other extreme who seek advice and feedback on every aspect of their application. Similarly, there are those who appreciate critical feedback, and those who do not. Finding the best way to approach researchers and their teams to garner the best results (in terms of strong grant submissions) is an arcane craft. At the heart of a successful process is the building of trust: trust that I have the skills and expertise to give them the feedback and assistance they need, and my trust that they will hear (and hopefully act on) the advice I give them.
How much ‘review’ might mean editing or re-writing is a ‘how long is a piece of string’ question. As a compulsive editor, I tend to do more than I probably should. But then, how much ‘should’ I be doing?

I mentioned above that I work with a particular college. This means that the university’s central research office provides final oversight of grant submissions and institutional sign-offs.

They are also the founts of knowledge and uber-experts on all aspects of the applications' compliance and eligibility. As much as I try to keep an eye on these elements as drafts are reviewed, there will always be things I miss. Having a central research office that has your back on these matters is gold.

I’ve been very particular about stating that this is what I do because I know that research offices and staff are organised differently from institution to institution. In addition, research development staff can have different approaches to the levels and type of assistance they offer.

I am lucky enough to be working in a college where I’ve had the chance to get to know researchers and their work well, and connect with a community of scholars who value my experience and advice.