Title: 500 Grammar based conversation questions with explanations of grammar points
Authors: Larry Pitts
Publisher: ESL Conversation Questions Publishing
Year of publication: 2015
Companion resources: NA
Source: Print copy bought from Amazon India
What really attracted me to this book was the caption “Conversation questions designed to elicit the thirty most common grammar points”. I increasingly find myself in situations where I need to answer the question “how can I elicit this target language?”
500 Grammar based conversation questions is a large book in terms of dimensions but it’s fairly slim both in terms of its page count and contents. It has lists of questions prefaced by a brief explanation of the target language. In principle, this could still be invaluable to new teachers. However, almost every single question includes the target language.
As … as : Are cats as fun as dogs?
Present perfect: What are some good restaurants you’ve eaten at?
Used to: Who did you use to play with in elementary school?
Will : What will happen to privacy in the future?
This is consistent throughout the book with the exception of the section on imperatives which has scenarios that would prompt the use of the target language:
Imperatives: What’s a card game from your country? How do I play it?
So I gather that the author’s interpretation of the word ‘elicit’ is different from how I see it. I think by elicit, he means targeted practice and he’s got some commentary at the back about using these questions in the classroom. He’s essentially describing a stage of the lesson where we provide practice with language that’s been taught as opposed to the language presentation stage which is what I had in mind.
From that perspective, this book isn’t all that useful. It contains suggested topics along with the target language in the form of a question. These sorts of conversation prompts are more effective when they are aligned to learner interests and the context of the lesson. In How to Teach Speaking, Thornbury describes criteria for effective speaking tasks and there are two that I reckon are really critical: productivity and purposefulness. I doubt whether prompts like “Where should I go to buy electronics?” will achieve either criterion in the context of advice.
On the other hand, I suppose for new teachers, the questions could be a helpful starting point but I don’t see them dipping into 500 Grammar based conversation questions for too long.
Earlier this year, I attended a course organised by the Hornby Regional School on designing communicative language assessments in Bangladesh. The course was taught by Dr. Rita Green, from Lancaster University, who is a research leader in the field of language assessments. My biggest take-away from the course was an alternative approach to designing listening tests called text mapping. Text mapping is a technique that Dr. Green conceived as a way of addressing some of the issues test designers experience when they select items from a listening text for a test. In January, when I was at the course, the technique was literally hot off the press and her new book Designing Listening Tests had just been published.
Text mapping questions prevailing practices for selecting items in a sound file for a test. Here’s what I normally do and perhaps you do something similar. I usually skim the transcript to get a sense of the text and maybe write a gist listening question and then read it again to come up with some listening for specific information questions. I might then listen to the clip to ensure that the accent or speed isn’t too challenging for the target learners.
Dr Green challenged this practice and these two quotes she cited drove the point home:
A transcript and the speech it represents are not the same thing, the original is a richer, contextualized, communicative event.
Life doesn’t come with a tapescript.
Text mapping attempts to address this gap in how we deal with listening texts. But, before we get on to the actual process, it’s important to distinguish between Specific Information & Important Details (SIID) and Main Ideas and Supporting Details (MISD). I think in teacher training, when we refer to these two listening strategies using the oft-used terms, listening for specific information and listening for detailed understanding, we inadvertently obfuscate what they really are. Dr Green differentiated the two in a way that was very easy to understand.
SIID requires selective listening. We listen for information such as dates, times, places, names, prices, percentages, numbers, measurements, acronyms, addresses, URLs, adjectives and nouns.
MISD requires careful listening. We listen for ideas, examples, reasons, clauses (nouns + verbs), descriptions, explanations, causes, evidence, opinions, conclusions, recommendations and points.
Text mapping can be used for gist, SIID and MISD but I’m going to describe the process for SIID which is what I experienced at the course and subsequently tried out on some unsuspecting colleagues.
1. Preparation
Choose a level-appropriate audio clip and organise a quiet room with good quality speakers. The text mappers you assemble should not have heard the clip before. The clip should be short (approximately 30 seconds).
You will need to prepare an Excel sheet with SIID from the clip along with the time stamps of individual items which means you will need to text map the sound file yourself.
2. Briefing the text mappers
You need at least three text mappers to ensure validity; a larger pool will increase it further. Explain to the text mappers that they are going to be listening for Specific Information and Important Details. You may need to ICQ this to ensure that all the text mappers are on the same page about what constitutes SIID. SIID is usually not more than one or two words.
3. Listening to the sound file
Play the clip only once and ask the text mappers to listen for SIID. They must not make any notes during this time. When the clip finishes, ask the text mappers to write down the SIID they noticed. The clip is played only once because Dr. Green suggested that overexposure could lead to too many items being identified.
4. Text mapping
Ask the text mappers to tell you the SIID they wrote down. Enter these into an Excel worksheet. Poll the group to see who else got each SIID and maintain a tally. If there are variations in a response because some text mappers only heard a part of it or misheard it, record these as separate entries. After you’ve finished eliciting these responses, paste in the time stamps you’d prepared earlier. You’re likely to get items that are not SIID. A simple test is to check whether the information being offered contains a noun and a verb, in which case it is MISD and not SIID.
There may be variations with numerals because in real life we tend to write down numbers immediately or ask for them to be repeated. The test designer will need to keep this in mind when selecting an item that has achieved consensus through a number of variations, such as Room No. 4045, 4045, 4054, 4055, etc.
The text mappers might not give you items chronologically which is alright. You’ll just need to reorder them so that they appear sequentially in the worksheet. You’ll also need to be strict about disallowing any responses that were not written down. I experienced this with my colleagues when several said “Oh I remembered that but I didn’t write it down.”
5. Selecting items
Look at the SIID that a majority of the text mappers were able to identify. These are the items you ought to be testing. However, there are some things to bear in mind. Items at the very beginning of the clip should be disregarded even if they achieve consensus (i.e., at least two thirds of the text mappers have identified them), because a test candidate may miss them simply because she is orienting herself to the clip in the first few seconds. Additionally, if two items appear within four to six seconds of each other, we ought to test one but not the other. Items should be evenly distributed through the sound file. It’s also important that all items test the same kind of listening behaviour – in this case, selective listening for SIID.
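For readers who like to see the selection rules laid out mechanically, here is a minimal sketch in Python. The function names and the sample tally are entirely hypothetical (they are not from Dr. Green’s book); the sketch simply encodes the rules described above: keep items that at least two thirds of the text mappers wrote down, drop items from the opening seconds of the clip, and space selected items out.

```python
def select_items(candidates, n_mappers, start_buffer=5, min_gap=6):
    """Filter candidate SIID items.

    candidates: list of (timestamp_in_seconds, text, tally), sorted by time.
    """
    threshold = 2 * n_mappers / 3          # consensus: two thirds of the mappers
    selected = []
    for ts, text, tally in candidates:
        if tally < threshold:
            continue                       # no consensus on this item
        if ts < start_buffer:
            continue                       # candidate is still orienting to the clip
        if selected and ts - selected[-1][0] < min_gap:
            continue                       # too close to the previously selected item
        selected.append((ts, text))
    return selected

# A hypothetical tally from five text mappers listening to a 30-second clip:
items = [
    (2, "Room 4045", 5),   # consensus, but too early in the clip
    (8, "Tuesday", 5),
    (11, "7 pm", 4),       # consensus, but within 6 seconds of "Tuesday"
    (20, "£45", 2),        # only two of five mappers caught it
    (27, "Smith", 4),
]
print(select_items(items, n_mappers=5))
# → [(8, 'Tuesday'), (27, 'Smith')]
```

In practice the judgement calls (merging variations like 4045/4054, for instance) still belong to the test designer; the point of the sketch is only that the consensus and spacing criteria are simple, explicit rules.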
6. Writing the test
The next step is to design questions using the items that were identified.
Reflections on text mapping
Here’s one that a colleague and I worked on with a sound file on making a hotel reservation. By text mapping a sound file, you have a systematic approach for identifying what you ought to test as we did with this file. The fact that you are listening to the file as opposed to reading a transcript facilitates the selection of more authentic items i.e., that reflect how we receive and process information in real situations. Selecting items from a transcript (and this often happens with me) may result in the testing of obscure items which we may not even register in a real life context.
When we ran this exercise with a group of our colleagues, we faced some resistance to the concept. The main bone of contention was that we were testing memory instead of listening skills. I think the clip we selected (at 2 min 10 seconds) was far too long. I recall Dr. Green using really short clips with us (around 20-30 seconds). In a Google Preview of her book, I also recall seeing something about chunking the clip for MISD and allowing text mappers to make notes while listening for SIID with longer clips. Unfortunately, those chapters are no longer available online.
However, our colleagues came around when they saw the extent of consensus for the items outlined in yellow in the preceding table; interestingly, this coincided with the results of an earlier round of text mapping with another group of text mappers.
I’m still a little uncertain about the relationship between the text mappers who are selected and the items that are identified through consensus. Text mapping as a process is designed not just for test designers but also to empower teachers to work collaboratively to design meaningful tests. Therefore, wouldn’t the items selected depend on the language proficiency level of the text mappers? I suspect that in a monolingual English-speaking environment, the results of text mapping may be different from those in a context where English is not the L1, like the one I experienced in Bangladesh. Further, what kind of impact does this have on item selection from the learner’s perspective, taking into consideration their own language proficiency? Theoretically, a sound file at B1 should have all of its items at B1, but in reality this may not be the case.
These unanswered questions notwithstanding, text mapping is a useful alternative to the somewhat random way in which listening tests are currently constructed. If you try out text mapping, do let me know about your experiences in the comments section.
Green, R. Designing Listening Tests: A Practical Approach. Palgrave Macmillan: 2017.
Helgesen in Wilson, J. J. How to teach listening. Pearson: 2008.
Lynch, T. Teaching Second Language Listening. OUP: 2010.
Many thanks to Azania Thomas for creating the text mapping sheet that I’ve used in this blog.
Dr. Green’s book is unfortunately really expensive (as interesting ELT books tend to be). You can read a preview here. It includes some relevant chapters on text mapping for gist and issues with listening texts and working with authentic sound files.
A couple of years ago, I met a teacher (let’s call her Meera) at a conference who’d been working with tertiary institutions on a freelance basis. Meera wanted to get into corporate training and was wondering if she could partner with me on a project. I didn’t really have anything for her at the time but a few months later I found myself on the phone with a client who desperately wanted a bespoke solution rolled out for an urgent need. My schedule was chock-a-block at the time and I didn’t have the bandwidth (as we say in corporate circles) to design the materials and deploy someone else to teach the course. So I asked them to take things forward with Meera (who I judged as fairly competent), which they did.
Little did I realise that I’d done them both a great injustice. Meera was utterly unprepared for the engagement and the client had assumed that she was on the ball because I’d recommended her. I know we often bandy about the bland encouragement to General English teachers that Business English and ESP courses don’t require them to be experts in business, management or a particular industry and that their expertise in language will help them sail through. I’m afraid it’s a claim that’s simultaneously true and false.
The uninitiated teacher or trainer risks missing the forest for the trees. Meera apparently did an intensive needs analysis but her focus was very narrow, and the sort of information she collected caused her to churn out or select run-of-the-mill language exercises with token nods to the business setting. Her materials were completely divorced from the context that her learners worked in and required language for, and from the specific need that she had been called in to address.
Knowing what to look for and how to feed these insights into materials-design comes with experience, and it helps if you’ve spent time with a corporate setup in a business/operational role i.e., not training or teaching. In the absence of that kind of experience, Frendo’s How to write corporate training materials could be a useful primer.
A key strength of this book is the extent to which it aligns practices to what typically happens within organisations. The idea that we should “investigate discourse practices” instead of merely collecting language needs, strikes a chord with me. Beyond educating the practitioner about process and projects, and SOPs and SIPOC charts, Frendo offers a series of incisive tasks that raise awareness of language, strategies and issues we ought to consider when developing corporate training materials.
My favourites include task 6, which draws on research by Williams (1998) comparing the language prescribed by coursebooks for functions within meetings with actual usage.
Examples from contemporary textbooks:
You’ve got a point there.
I totally agree with you.
Absolutely. / Precisely. / Exactly.
Examples from real-life business meetings:
implied by the function ‘accept’ (e.g., yes)
implied by not disagreeing
Frendo goes on to state:
It is easy to see why St John described business English as ‘a material-led movement rather than a research-led movement’ (p15). It is writer’s intuition, rather than what we know about discourse, which has been leading the way. And many commentators feel that not much has changed since that article was written.
There are also several transcript-based tasks that draw attention to features of Business English as a Lingua Franca (BELF) including “code-switching, ellipsis, silence, incomplete utterances, repetition, deviation from ‘standard’ English” all of which Frendo suggests as worth exploring in the training room.
I found the section on techniques for gathering this kind of evidence interesting. There were some that I was familiar with, such as language snippets, recordings, corpus analysis, work shadowing and questionnaires, and others that I’ve never actually used, such as simulated conversations and anecdote circles (sort of like a focus group discussion, but more informal).
Task 11 is another interesting one. It asks the reader to analyse an annual report and identify authentic texts that could be used for different roles and needs. I wonder how many Business English trainers have actually read an annual report.
There are also case studies of training projects Frendo has worked on and the solutions he facilitated. Again, we see a strong integration of what actually happens in organizations such as scrum meetings and how this might unfold in a training programme.
How to write corporate training materials is a useful compilation of practices for someone who is making the transition from General English to Business English/ESP and it’s particularly relevant to those who are working as independent consultants. However, it’s also full of insights for practitioners who have been consulting in corporate contexts for a while because it questions some of our practices, especially when we rely on intuition, rather than observation and research to inform our design.
I’m on the hunt for quick icebreakers and energizers for use in the teacher training I facilitate for the state sector where establishing some sort of bonhomie is extremely critical for keeping people focused.
Icebreakers is divided into games, exercises, and simulations (a slightly arbitrary distinction). The overall feel is very dated and the activities are overly complex. The first three alone are quite representative of the rest of the book.
Birthday Scores: Participants compare each other’s birthdays and form groups to get the highest score based on a point system (born on the same day: 12 points each, etc.). Each activity is divided into Facilitator’s notes and Players’ notes, which also include some kind of handout. The instructions in the Players’ notes are generally so verbose that I suspect participants would spend most of the activity trying to understand them.
Diverse Points: Participants meet and talk in a leisure area and then move over to a negotiation area where they allocate 100 points between themselves using one of four combinations: 90/10, 80/20, 70/30 or 60/40. This activity has some potential but it’s not clear what participants are meant to be negotiating about (Who seems to have the best personality? Eeek!)
Growing Paper Clips: Take a look at the instructions for this activity, in which participants join their own paper clips to others’ while introducing themselves; mind you, these are the instructions that are meant to be handed over to the participants.
It’s hard to understand why you would want to run a simulation (in fact, they’re not really simulations, they’re role plays) as an icebreaker. For instance, in Monolith, participants pretend to be archaeologists and sociologists examining a stone object in the South American jungle.
I might be able to dig out some ideas to adapt from this book, but I just don’t have the patience to go through each activity carefully. On the flip side, excerpts from this book could serve as a useful demonstration of how not to write activities.
Icebreakers is available as a low-priced edition for India but it’s really not one for the resource bookshelf.
Mura Nava, through his blog, Twitter handle and the G+ Corpus Linguistics community, regularly shares resources for using corpora in the classroom. He’s also been on the mentoring team for Lancaster University’s Corpus Linguistics MOOC on FutureLearn. It’s clearly an area that he’s passionate about, and what I really appreciate is all the practical classroom-oriented stuff on his blog for us corpulent novices (do they collocate?).
Quick Cups of COCA distills all of that corpus goodness into a succinct booklet for language teachers who’d like to use corpora to prepare for lessons or demonstrate language use and answer queries during lessons. The booklet is structured around a set of search functions you can execute on the British National Corpus (BNC) and Brigham Young University’s Corpus of Contemporary American English (BYU-COCA).
While I use corpora intermittently and am familiar with the functions that help me respond to those perennial questions that start with “What is the difference between … ” or “When do we use …”, I wasn’t aware that BYU-COCA could be used to find synonyms (search term #2). Oh, it’s so wonderfully simple, and the best part is that it shows you the frequency of occurrence and lets you explore co-text. Bye bye meaningless list of words on Thesaurus.com, hello COCA.
The other interesting search strings include ‘Lemma & POS’ (search term #5) which allows the user to look for all the possible collocates of a word for all the forms of the target word (lemma), and ‘quantifier/determiner + of + relative pronoun’ (search term #4) which lets you query the corpus for examples of usage for phrases like ‘all of which’ which could possibly become marker sentences in your lessons.
All the screenshots from the corpus are hyperlinked to the actual searches on COCA which means this concise book potentially offers the reader an afternoon of happy COCA exploration.
Mura prefaces the booklet by suggesting that it’s intended for those who are at least slightly familiar with the BYU-COCA corpus interface, but that parts of it could be of value to absolute beginners as well. I’m not so sure absolute beginners would benefit, because teachers I’ve introduced the BNC or COCA to have been a bit intimidated by it all, and some of those search strings are a tad scary-looking for the uninitiated. However, I think I could use Quick Cups of COCA as a follow-up to an introductory session on using corpora, where teachers explore language use through tasks that challenge them to deploy the search functions from the book.
Quick Cups of COCA is a must read if you’d like to know more ways of exploiting the BNC or COCA for your learners. Mura has very kindly made it available as a complimentary download from Smashwords in a range of formats.
Title: A Handbook of Spoken Grammar. Strategies for Speaking Natural English
Authors: Ken Paterson, Caroline Caygill and Rebecca Sewell
Publisher: Delta Publishing
Year of publication: 2011
Companion resources: Audio CD
Source: British Council Library
I actually assumed this was a book on methodology but it is in fact a study book for students. There isn’t much by way of information for the teacher except a cursory note that suggests the book is meant to be used for self-study but could potentially also be used in the classroom. However, the pages are not marked photocopiable and at ₹1,734 ($26), I don’t see how this could be used as a student resource book unless you infringe copyright.
It would have been interesting to see a more detailed comparison of written and spoken grammars beyond a tiny note in the introduction. The authors suggest that “the features of spoken grammar help to create an easy-going, natural kind of English.” They also add that this type of grammar is more economical (Any luck?), simpler (I said to Anne, ‘look are you sure?’), less direct and therefore more polite (What sort of job do you do, then?) and provides the speaker with choices about when to reveal the subject (It’s such a wonderful place to spend a few days in, New York).
There are 20 units and each contains guided discovery and practice exercises for specific focus areas, with answers at the back. For example, the tenth unit, titled Say Less, focuses on spoken ellipsis (A: Would anyone buy anything at that market? B: Oh, I would). In a sense, I feel the title of this book is a bit deceptive. The units seem largely to cover language features that would help the speaker perform discourse and/or pragmatic functions, such as sounding more polite (unit 7) and being vague (unit 8). For instance, sounding more polite is a round-up of the usual suspects: softeners (would you mind …?), preparators (I was hoping) etc.
As the book is intended for learners, it omits metalanguage of the sort that I’ve just mentioned. This, however, would have been interesting input from a teaching perspective. The authors make no mention in the book of what language level these exercises are pitched at (although the book’s site says B1 and above). Some units seem appropriate for students with lower proficiency (Unit 16: Make short responses to agree or show interest & Unit 12: How to use oh, ah, wow, ouch, etc.), and others seem positioned for more advanced learners (Unit 18: Follow your partner, which explores a sort of backchannelling using synonymous phrases – A: It’s hot today, isn’t it? B: Boiling! Shall we sit in the garden?).
There is a dearth of good classroom materials on spoken discourse and A Handbook of Spoken Grammar might address that need. I tend to get excited about incorporating discourse and pragmatics into my courses only to find that I introduce it at the wrong time, treat it too subtly, make it far too explicit, or overestimate my learners’ abilities. It would have been useful to have some more guidance on using these units effectively and an exploration of the challenges of facilitating a more natural speaking style.
Delta Publishing offer the contents page and sample units as free downloads.
Have you used this book with your learners? I’d love to hear from you about your experience.