Never seems but always
By Prof. Dr. Sohail Ansari
“Say (O Muhammad) the things that my Lord has
indeed forbidden are Al-Fawahish, whether committed openly or secretly, sins,
unrighteous oppression, joining partners with Allah for which He has given no
authority, and saying things about Allah of which you have no knowledge.” (Al-A’raf: 33)
What others know but cannot define makes violation possible.
· Never seeming but always doing makes violation of censorship laws sustainable.
Quotes:
· "Commercial speech is like obscenity... we can't seem to define it, but we know it when we see it." (Jef I. Richards)
· "It's true that obscenity is a matter of taste and in the eye of the beholder." (Christopher Hitchens)
The difference between approach and technique
Techniques are the specific ways in which you carry out your methods. There may be differences at the level of dictionary definitions, but in common usage they are essentially the same. The interview is a data-collecting method; the structured interview is a technique.
A structured interview (also known as a standardized interview or a researcher-administered survey) is a quantitative research method commonly employed in survey research. The aim of this approach is to ensure that each interview is presented with exactly the same questions in the same order. A structured interview is one in which a particular set of predetermined questions is prepared by the interviewer in advance. An unstructured interview refers to an interview in which the questions to be asked of the respondents are not set in advance.
Technique and method can sometimes be used
interchangeably.
Here is the main difference:
Method: a settled kind of procedure, usually according to a definite, established, logical, or systematic plan: the open-hearth method of making steel; one method of solving a problem.
Technique: the manner and ability with which an artist, writer, dancer, athlete, lawyer, or the like employs the technical skills of a particular art or field of endeavor so as to effect a desired result.
Content analysis
Content analysis is a research method for studying communication artifacts. Social scientists use
content analysis to quantify patterns in
communication. Practices and philosophies of content analysis vary between scholarly communities. They all involve systematic reading or observation of texts or
artifacts which are assigned labels (sometimes called codes) to indicate the presence of interesting, meaningful
patterns. After labeling a large set of texts, a researcher is able to
statistically estimate the proportions of patterns in the
texts, as well as correlations between patterns. Computers
are increasingly used in content analysis. Popular qualitative data analysis
programs provide efficient work-flow and data management tools for labeling.
Simple computational techniques can provide descriptive data such as word
frequencies and document lengths. Machine learning classifiers can greatly
increase the number of texts which can be labeled, but the scientific utility
of doing so is a matter of debate.
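As a minimal sketch of the simple descriptive statistics mentioned above (word frequencies and document lengths), the Python fragment below counts them for a tiny invented corpus; the texts and variable names are illustrative only:

```python
from collections import Counter
import re

# A tiny hypothetical corpus (the texts are invented for illustration).
texts = [
    "Peace talks resumed after the ceasefire was announced.",
    "The ceasefire collapsed and fighting resumed in the north.",
    "Community radio covered the peace talks extensively.",
]

for i, text in enumerate(texts, start=1):
    words = re.findall(r"[a-z']+", text.lower())  # very simple tokenisation
    freq = Counter(words)                          # word frequencies
    print(f"Document {i}: {len(words)} words; most common: {freq.most_common(3)}")
```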
Artefact:
1. an object made by a human being, typically one of cultural or historical interest. "gold and silver artefacts"
2. something observed in a scientific investigation or experiment that is not naturally present but occurs as a result of the preparative or investigative procedure. "the curvature of the surface is an artefact of the wide-angle view"
Goals of Content Analysis
Content analysis is best understood as a broad
family of techniques. Effective researchers choose techniques that best help them
answer their substantive questions. That said, according to Klaus Krippendorff,
six questions must be addressed in every content analysis:
1. Which data are analyzed?
2. How are the data defined?
3. From what population are the data drawn?
4. What is the relevant context?
5. What are the boundaries of the analysis?
6. What is to be measured?
The simplest and most objective form of content analysis
considers unambiguous characteristics of the text such as word
frequencies, the page area taken by a newspaper column, or the duration of
a radio or television program. Analysis of simple word
frequencies is limited because the meaning of a word depends on
surrounding text. Keyword In Context routines
address this by placing words in their textual context. This
helps resolve ambiguities such as those introduced by synonyms
and homonyms.
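A Keyword In Context (KWIC) routine can be sketched in a few lines of Python; the window size and the sample sentence below are arbitrary choices made purely for illustration:

```python
import re

def kwic(text, keyword, window=4):
    """Print each occurrence of `keyword` with `window` words of context on either side."""
    words = re.findall(r"\w+", text.lower())
    for i, w in enumerate(words):
        if w == keyword:
            left = " ".join(words[max(0, i - window):i])
            right = " ".join(words[i + 1:i + 1 + window])
            print(f"{left:>40} [{w}] {right}")

sample = ("The bank raised interest rates while protesters gathered "
          "on the river bank to demand action.")
kwic(sample, "bank")   # the surrounding words separate the two senses of "bank"
```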
A further step in analysis is the distinction between
dictionary-based (quantitative) approaches and qualitative approaches. Dictionary-based
approaches set up a list of categories derived from the frequency list of words
and control the distribution of words and their respective
categories over the texts. While quantitative content analysis in this way transforms observations of found categories into statistical data, qualitative content analysis focuses more on intentionality and its implications. There are strong parallels
between qualitative content analysis and thematic analysis.
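A dictionary-based (quantitative) approach can be sketched as follows: a category dictionary maps each category to a word list, and each text is scored by counting dictionary hits. The categories and word lists here are invented for illustration; a real study would derive them from the corpus frequency list:

```python
import re
from collections import Counter

# Hypothetical category dictionary (invented for illustration).
categories = {
    "conflict": {"war", "fighting", "attack", "violence"},
    "peace":    {"peace", "ceasefire", "reconciliation", "dialogue"},
}

def code_text(text):
    """Return how many words in `text` fall into each dictionary category."""
    words = re.findall(r"\w+", text.lower())
    counts = Counter()
    for word in words:
        for category, vocabulary in categories.items():
            if word in vocabulary:
                counts[category] += 1
    return counts

print(code_text("The ceasefire held, and dialogue replaced fighting."))
# Counter({'peace': 2, 'conflict': 1})
```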
Machine translation can use a method based on dictionary entries, which means
that the words will be translated as a dictionary does – word by word, usually without much
correlation of meaning between them.
(Correlation: a mutual relationship or connection between two or more things. "research showed a clear correlation between recession and levels of property crime")
Computational Tools
More generally, content analysis is research using the
categorization and classification of speech, written text, interviews,
images, or other forms of communication. In its beginnings, using the first
newspapers at the end of the 19th century, analysis was done manually
by measuring the number of lines and the amount of space given to a subject.
With the rise of common computing facilities like PCs,
computer-based methods of analysis are growing in popularity. Answers to open
ended questions, newspaper articles, political party manifestoes, medical
records or systematic observations in experiments can all be subject to
systematic analysis of textual data.
With the contents of communication available in the form of machine-readable text, the input is analyzed for frequencies and coded into categories to build up inferences.
Computation is any type of calculation that includes both arithmetical and non-arithmetical steps and follows a well-defined model understood and described as, for example, an algorithm.
Reliability
Robert Weber notes: "To make valid inferences from the
text, it is important that the classification procedure be reliable in the
sense of being consistent: Different people should code the same text in the
same way". The
validity, inter-coder reliability and intra-coder reliability are
subject to intense methodological research efforts over long
years. Neuendorf suggests that when human coders are used in content
analysis two coders should be used. Reliability of human coding is often
measured using a statistical measure of intercoder reliability or
"the amount of agreement or correspondence among two or more coders"
Kinds of Text
There are five types of texts in content analysis:
1. written text, such as books and papers
2. oral text, such as speech and theatrical performance
3. iconic text, such as drawings, paintings, and icons
4. audio-visual text, such as TV programs, movies, and videos
5. hypertexts, which are texts found on the Internet
History
Over the years, content analysis has been applied to a variety
of scopes. Hermeneutics and philology have long used content analysis
to interpret sacred and profane texts and, in not a few cases, to
attribute texts' authorship and authenticity.
In recent times, particularly with the advent of mass communication, content analysis has seen increasing use for the in-depth analysis and understanding of media content and media logic. The political scientist Harold Lasswell formulated the core questions of content analysis in its early-to-mid 20th-century mainstream version: "Who says what, to whom, why, to what extent and with what effect?". The
strong emphasis on a quantitative approach begun by Lasswell was carried forward by another "father" of content analysis, Bernard Berelson, who proposed a definition of
content analysis which, from this point of view, is emblematic:
"a
research technique for the objective, systematic and quantitative description
of the manifest content of communication".
Quantitative content analysis has enjoyed a renewed popularity
in recent years thanks to technological advances and fruitful application in mass communication and personal communication research.
Content analysis of textual big data produced by new media, particularly social media and mobile devices, has become popular. These
approaches take a simplified view of language that ignores the complexity of semiosis, the process by which
meaning is formed out of language. Quantitative content analysts have been
criticized for appealing to statistical measures to justify the
objectivity and systematic nature of their methods while ignoring the
limitations of their approach.
Recently, Arash Heydarian Pashakhanlou has argued for a
combination of quantitative, qualitative, manual and computer-assisted analysis in a single study to offset the weaknesses of a partial content analysis and enhance
the reliability and validity of a research project.
Content analysis can also be described as studying traces, which are documents from past times,
and artifacts, which are non-linguistic documents. Texts are understood to be
produced by communication processes in a broad sense of that phrase—often
gaining meaning through abduction.
Personal
Communication
1. It's
the communication between person to person.
2. It
provides necessary communication between two people making man a social animal.
3. We have
letters, e-mails, SMSs, telephone and mobile phone facilities which also
includes STD and ISD services.
Mass
Communication
1. It's
the communication among masses.
2. It
provides entertainment as well as creates awareness among the masses.
3. It
includes radio, television, newspapers, magazines, books, films, etc.
Abductive reasoning means abducing (taking away) a logical assumption, explanation, inference, conclusion, hypothesis, or best guess from an observation or set of observations. Because the conclusion is merely a best guess, it may or may not be true.
Examples of Abductive Reasoning
Jury duty decisions are one example of
abductive reasoning. Let's say you're a juror and the defendant looks like the
image of the man on the security camera robbing the bank. He stutters and
pauses, like he is guilty, when answering questions posed by the prosecutor.
You conclude, as a juror on your first day as a member of the jury, that he is
guilty, but you are not certain. Here, you have made a decision based on your
observations, but you are not certain it is the right decision.
Daily decision-making is also an example
of abductive reasoning. Let's say you're stuck in traffic on the interstate and
see ambulance and police lights about a half mile ahead. There is an exit
coming up and you could take some backroads and then get back on the interstate
after the accident. You listen to the traffic report on the radio. You look and
see if the exit looks congested. Taking all the information at hand, you make
the decision to stay on the interstate and wait for the accident to clear. You
made the best decision you could given all of the observations.
Are politicians motivated by power?
In a broad sense that may be true since a person who is utterly indifferent to
power would not make the effort to become a politician; however, there are many other
factors to be considered as well.
More elaborate description
The method of content analysis enables the
researcher to include large amounts of textual information and systematically
identify its properties, such as the frequencies of the most-used keywords, by locating the more important structures of its communication content. Such amounts of textual information must be categorized to
provide a meaningful reading of content under scrutiny. For
example, David Robertson created a coding frame for a
comparison of modes of party competition between British and American
parties. It was developed further in 1979 by the Manifesto
Research Group aiming at a comparative content-analytic
approach on the policy positions of
political parties. This group created the Manifesto
Project Database.
Since the 1980s, content analysis has become an increasingly
important tool in the measurement of success in public relations (notably media relations) programs and the assessment
of media profiles,
such as political media slant—orientation towards one of the two major parties. In
1982, John Naisbitt published
his popular Megatrends, based on content analysis in the US media.
In analyses of this type, data from content analysis is usually combined with media data (circulation,
readership, number of viewers and listeners, frequency of publication). It has
also been used by futurists to identify trends.
The creation of coding frames is intrinsically related to
a creative approach to variables that influence textual content. In
political analysis, these variables could be political scandals, the impact
of public opinion polls, sudden events in external politics, inflation
etc. Mimetic Convergence, created by Fátima
Carvalho for the comparative analysis of electoral proclamations on free-to-air
television, is an example of creative articulation of
variables in content analysis. The methodology describes the construction
of party identities during long-term party competitions on TV, from a dynamic
perspective, governed by the logic of the contingent. This method aims to capture
the contingent logic observed in electoral campaigns by
focusing on the repetition and innovation of themes sustained in party broadcasts.
According to the post-structuralist perspective from which electoral competition is analysed, party identities ('the real') cannot speak without mediations, because there is no natural centre fixing the meaning of a party structure; it depends instead on ad-hoc articulations. There is no empirical reality
outside articulations of meaning. Reality is an outcome of power
struggles that unify ideas of social structure as a result of contingent
interventions. In Brazil, these contingent
interventions have proven to be mimetic and convergent rather than divergent
and polarised, being integral to the repetition of dichotomised world-views.
What is a 'Contingent Liability'
A contingent liability is a potential liability that
may occur, depending on the outcome of an uncertain future event. A contingent
liability is recorded in the accounting records if the contingency is probable and the amount of
the liability can be reasonably estimated. If either of these conditions is not
met, the liability may be disclosed in a footnote to the financial statements
or not reported at all.
Uses
Holsti groups fifteen uses of content analysis into three
basic categories:
· make inferences about the antecedents of a communication
· describe and make inferences about the characteristics of a communication
· make inferences about the effects of a communication.
He also places these uses into the context of the basic communication paradigm.
The following table shows fifteen uses of content analysis in
terms of their general purpose, element of the communication paradigm
to which they apply, and the general question they are intended to answer.
Uses of Content Analysis by Purpose, Communication Element,
and Question
| Purpose | Element | Question | Use |
|---------|---------|----------|-----|
| Make inferences about the antecedents of communication | Source | Who? | Answer questions of disputed authorship (authorship analysis) |
| Make inferences about the antecedents of communication | Encoding process | Why? | Secure political & military intelligence; analyse traits of individuals; infer cultural aspects & change; provide legal & evaluative evidence |
| Describe & make inferences about the characteristics of communication | Channel | How? | Analyse techniques of persuasion; analyse style |
| Describe & make inferences about the characteristics of communication | Message | What? | Describe trends in communication content; relate known characteristics of sources to messages they produce; compare communication content to standards |
| Describe & make inferences about the characteristics of communication | Recipient | To whom? | Relate known characteristics of audiences to messages produced for them; describe patterns of communication |
| Make inferences about the consequences of communication | Decoding process | With what effect? | Measure readability; analyse the flow of information; assess responses to communications |
As it
relates to survey research, content analysis is a research method that is
applied to the verbatim responses given
to open-ended questions in order to code those answers into a meaningful set of categories that lend themselves to further quantitative
statistical analysis. In the words of Bernard Berelson, one of the early
scholars explaining this method, "Content analysis
is a research technique for the objective, systematic,
and quantitative description of the manifest content of communication." By coding
these verbatim responses into a relatively small set
of meaningful categories, survey researchers can create new variables in their
survey data sets to use in their analyses.
If something lends itself to something else, it is suitable for that thing or can be considered in that way:
The
novel's complex, imaginative style does
not lend itself to translation.
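As a hedged sketch of the coding step just described, the fragment below assigns verbatim open-ended answers to a small set of categories and stores the result as a new variable; the responses, categories, and keyword rules are all invented for illustration, and a real coding frame would usually allow more than one category per answer:

```python
# Hypothetical verbatim answers to "Why do you listen to this station?" (invented data).
responses = [
    "I like the music in the evenings",
    "The news keeps me informed about my area",
    "Mostly for the football commentary",
    "The morning news and the weather",
]

# A simple keyword-based coding frame (invented for illustration).
coding_frame = {
    "music": ["music", "song"],
    "news":  ["news", "informed", "weather"],
    "sport": ["football", "sport", "commentary"],
}

def code_response(answer):
    """Return the first category whose keywords appear in the answer, else 'other'."""
    text = answer.lower()
    for category, keywords in coding_frame.items():
        if any(k in text for k in keywords):
            return category
    return "other"

# The coded category becomes a new variable alongside each verbatim response.
coded = [{"verbatim": r, "category": code_response(r)} for r in responses]
for row in coded:
    print(row)
```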
Content analysis is a method for summarizing any form of content by counting various aspects of the content. This enables a more objective evaluation than comparing content based on the impressions of a listener. For example, an impressionistic summary of a TV program is not content analysis. Nor is a book review: it is an evaluation.
Content analysis, though it often analyses written words, is a quantitative method. The results of content analysis are numbers and percentages. After doing a content analysis, you might make a statement such as "27% of programs on Radio Lukole in April 2003 mentioned at least one aspect of peacebuilding, compared with only 3% of the programs in 2001."
Though it may seem crude and simplistic to make such statements, the counting serves two purposes:
- to remove
much of the subjectivity from summaries
- to
simplify the detection of trends.
As you’ll see below, content analysis can actually be a lot more subtle than the above example. There’s plenty of scope for human judgement in assigning relevance to content.
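To show how little arithmetic sits behind a statement like the Radio Lukole example above, the sketch below turns program-level codes into percentages; all of the values are invented purely for illustration:

```python
# Hypothetical coding results: one Boolean per sampled program, True if the
# program mentioned at least one aspect of peacebuilding (all values invented).
programs_2001 = [False] * 97 + [True] * 3     # 3 of 100 sampled programs
programs_2003 = [False] * 73 + [True] * 27    # 27 of 100 sampled programs

for year, sample in (("2001", programs_2001), ("2003", programs_2003)):
    share = 100 * sum(sample) / len(sample)
    print(f"{year}: {share:.0f}% of sampled programs mentioned peacebuilding")
```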
1.
What is content?
The content that is
analysed can be in any form to begin with, but is often converted into written
words before it is
analysed. The original source can be printed publications, broadcast programs,
other recordings, the internet, or live situations. All this content is something that
people have created. You can’t do content analysis of (say) the weather - but if somebody writes a report predicting
the weather, you can do a content analysis of that. All this is content...
| Type of content | Examples |
|-----------------|----------|
| Print media | Newspaper items, magazine articles, books, catalogues |
| Other writings | Web pages, advertisements, billboards, posters, graffiti |
| Broadcast media | Radio programs, news items, TV programs |
| Other recordings | Photos, drawings, videos, films, music |
| Live situations | Speeches, interviews, plays, concerts |
| Observations | Gestures, rooms, products in shops |
Media content and audience content
That's one way of looking at content. Another way is to divide content into two types: media content and audience content. Just about everything in the above list is media content. But when you get feedback from audience members, that's audience content. Audience content can be either private or public. Private audience content includes:
- open-ended questions in surveys
- interview transcripts
- group discussions.
Public audience content includes:
- letters to the editor
- postings to an online discussion forum
- listeners' responses in talkback radio.
Why do content analysis?
If you're also doing audience research, the main reason for also doing content analysis is to be able to make links between causes (e.g. program content) and effects (e.g. audience size). If you do an audience survey, but you don't systematically relate the survey findings to your program output (the programs you produced), you won't know why your audience might have increased or decreased. You might guess, when the survey results first appear, but a thorough content analysis is much better than a guess.
For a media organization, the main purpose of content analysis is to evaluate and improve its programming. All media organizations are trying to achieve some purpose. For commercial media, the purpose is simple: to make money, and survive. For public and community-owned media, there are usually several purposes, sometimes conflicting - but each individual program tends to have one main purpose.
As a simple commercial example, the purpose of an advertisement is to promote the use of the product it is advertising: first by increasing awareness, then by increasing sales. The purpose of a documentary on AIDS in southern Africa might be to increase awareness of ways of preventing AIDS, and in the end to reduce the level of AIDS. Often, as this example has shown, there is not a single purpose, but a chain of them, with each step leading to the next.
Using audience research to evaluate the effects (or outcome) of a media project is the second half of the process. The first half is to measure the causes (or inputs) - and that is done by content analysis. For example, in the 1970s a lot of research was done on the effects of broadcasting violence on TV. If people saw crimes committed on TV, did that make them more likely to commit crimes? In this case, the effects were crime rates, often measured from police statistics. The problem was to link the effects to the possible causes. The question was not simply "does seeing crime on TV make people commit crimes?" but "What types of crime on TV (if any) make what types of people (if any) commit crimes, in what situations?" UNESCO in the 1970s produced a report summarizing about 3,000 separate studies of this issue - and most of those studies used some form of content analysis.
When you study causes and effects, as in the above example, you can see how content analysis differs from audience research:
- content
analysis uncovers causes
- audience
research uncovers effects.
The process of content analysis
Content analysis has six main stages:
- Selecting content for analysis
- Units of content
- Preparing content for coding
- Coding the content
- Counting and weighting
- Drawing conclusions
Selecting
content for analysis
Content is huge: the world contains a near-infinite amount of content. It’s rare that an
area of interest has so little content that you can analyse it all. Even when
you do analyse the whole of something (e.g. all the pictures in one issue of a magazine) you will usually want to
generalize those findings to a
broader context (such as all the issues of that magazine). In other words, you
are hoping that the issue you selected is a representative sample. Like
audience research, content analysis involves
sampling. But with content analysis, you’re sampling content,
not people. The body of information you draw the sample from is often called a corpus – Latin for "body" (a corpus is a collection of written texts).
Deciding sample size
Unless you want to
look at very fine distinctions, you
don’t need a huge sample. The same principles apply for content analysis as for
surveys: most of the time, a sample between 100 and 2000 items is enough - as
long as it is fully representative. For radio and TV, the easiest way to sample is by time.
How would you sample programs during a month? With 30 days, you might decide on a sample of 120 periods. Programs vary greatly in length, so use quarter-hours as the sampling unit. That's 4 quarter-hours each day for a month. Obviously you need to vary the time periods to make sure that
all times of day are covered. An easy way to do this, assuming you’re
on air from 6 am to midnight, is to make a sampling plan like this:
| Day | Quarter-hours beginning | | | |
|-----|------|------|------|------|
| 1 | 0600 | 1030 | 1500 | 1930 |
| 2 | 0615 | 1045 | 1515 | 1945 |
| 3 | 0630 | 1100 | 1530 | 2000 |
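A rotating plan like the one above can be generated mechanically; the sketch below mirrors the example's broadcast hours and 15-minute daily shift, but the helper function and its parameters are just an illustration, not a prescribed procedure:

```python
from datetime import datetime, timedelta

def sampling_plan(days=30, start_hour=6, end_hour=24, slots_per_day=4, shift_minutes=15):
    """Return, for each day, four quarter-hour start times spread across broadcast hours,
    shifting the whole pattern by 15 minutes each day so all times of day get covered."""
    broadcast_minutes = (end_hour - start_hour) * 60
    spacing = broadcast_minutes // slots_per_day          # gap between slots within a day
    plan = []
    for day in range(days):
        offset = (day * shift_minutes) % spacing          # daily shift, wrapping around
        base = datetime(2003, 4, 1, start_hour, 0) + timedelta(days=day)
        slots = [(base + timedelta(minutes=offset + s * spacing)).strftime("%H%M")
                 for s in range(slots_per_day)]
        plan.append((day + 1, slots))
    return plan

for day, slots in sampling_plan()[:3]:
    print(day, slots)
# 1 ['0600', '1030', '1500', '1930']
# 2 ['0615', '1045', '1515', '1945']
# 3 ['0630', '1100', '1530', '2000']
```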
With print media, the same principles apply, but it doesn’t make sense to base the sample on time of day. Instead, use page and column numbers. Actually, it’s a lot easier with print media, because you don’t need to organize somebody (or program a computer) to record the on-air program at regular intervals.
The need for a focus
When you set out to do
content analysis, the first thing to acknowledge is that it’s impossible to be comprehensive.
No matter how hard you
try, you can’t analyse content in all possible ways. I’ll demonstrate, with an
example. Let’s say that you manage a radio station. It’s on air for 18 hours a day, and no one person seems to
know exactly what is broadcast on each program. So you decide that during April
all programs will be taped. Then you will listen to the tapes and do a
content analysis.
First problem: 18 hours a day, for 30 days, is 540 hours. If you work a 40-hour week, it will take almost 14 weeks just to play the tapes back. But that's only listening - without pausing for content analysis! So instead, you get the tapes transcribed. Most people speak about 8,000 words per hour. Thus your transcript runs to over 4 million words – about 40 books the size of this one.
Now the content analysis can begin! You make a detailed analysis: hundreds of pages of tables and summaries. When you’ve finished (a year later?) somebody asks you a simple question, such as "What percentage of the time are women’s voices heard on this station?"
If you haven’t anticipated that question, you’ll have to go back to the transcript and laboriously calculate the answer. You find that the sex of the speaker hasn’t always been recorded. You make an estimate (only a few days’ work, if you’re lucky) then you’re asked a follow-up question, such as "How much of that time is speech, and how much is singing?"
Oops! The transcriber didn’t bother to include the lyrics of the songs broadcast. Now you’ll have to go back and listen to all those tapes again!
This example shows the importance of knowing what you’re looking for when you do content analysis. Forget about trying to cover everything, because (a) there’s too much content around, and (b) it can be analysed in an infinite number of ways. Without having a clear focus, you can waste a lot of time analysing unimportant aspects of content. The focus needs to be clearly defined before you begin work.
An example of a focus is: "We’ll do a content analysis of a sample of programs (including networked programs, and songs) broadcast on Radio Lukole in April 2003, with a focus on describing conflict and the way it is managed."
(Unit: an individual thing or person regarded as single and complete but which can also form an individual component of a larger or more complex whole. "large areas of land made up of smaller units")
The unit of analysis is the major entity that is being analyzed in a study.
It is the 'what' or 'who' that is being studied. In social science research,
typical units of analysis include individuals (most common), groups, social
organizations and social artifacts.
Units of content
To be able to count
content, your corpus needs to
be divided into a number of units, roughly similar in size. There’s no limit to
the number of units in a corpus, but in general the larger
the unit, the fewer units you need. If the units you are counting vary greatly in length, and if you are
looking for the presence of some theme, a long unit will have a greater chance of including that
theme than will a short unit. If the longest units are many times the size of
the shortest, you may need to change the unit - perhaps "per
thousand words" instead of "per web page." If the interviews vary greatly in length, a
time-based unit may be more
appropriate than "per interview."
Units of media content
Depending on the size
of your basic unit, you’ll need to take a different approach to coding. The main options are
(from shortest to longest):
- A word or phrase. If you are studying the use of language, words are an appropriate unit (perhaps also grouping synonyms together, and including phrases). Though a corpus may have thousands of words, software can count them automatically.
- A paragraph,
statement, or conversational turn: up to a few hundred words.
- An article. This
might be anything from a short newspaper item to a magazine article or web
page: usually between a few hundred and a few thousand words.
- A large document. This
can be a book, an episode of a TV program, or a transcript of a long radio
talk.
Units of audience content
When you are analysing audience content (not media content) the unit will normally be based on the data collection format and/or the software used to store the responses. The types of audience content most commonly produced from research data are:
- Open-ended responses to a question in a survey (usually all in one large file).
- Statements produced by consensus groups (often in one small file).
- Comments from in-depth interviews or group discussions (usually a large text file from each interview or group).
Large units are harder
to analyse
Usually the corpus is
a set of the basic units: for example, a set of 13 episodes in a TV series, an 85-message
discussion on an email listserv over several months, 500 respondents’ answers
to a survey question - and so on. What varies is (a) the number of units in the
corpus, and (b) the size of the units.
Differences in these figures will require different approaches to content analysis. If you are studying the use of language, focusing on the usage of new words, you will need to use a large corpus - a million words or so - but the size of the unit you are studying is tiny: just a single word. The word frequencies can easily be compared using software such as Wordsmith.
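Such a frequency comparison can also be sketched without specialist software (Wordsmith is what the text mentions; the snippet below uses only the Python standard library, and the two mini-corpora are invented):

```python
import re
from collections import Counter

def relative_frequencies(corpus):
    """Word frequencies per thousand words for a corpus given as one long string."""
    words = re.findall(r"\w+", corpus.lower())
    counts = Counter(words)
    return {w: 1000 * c / len(words) for w, c in counts.items()}

# Two hypothetical mini-corpora from different periods (invented data).
older = "the broadcast reached listeners by radio and by newspaper " * 50
newer = "the podcast reached listeners by smartphone and by radio " * 50

old_freq, new_freq = relative_frequencies(older), relative_frequencies(newer)
for word in ("podcast", "smartphone", "newspaper", "radio"):
    print(f"{word:>10}: {old_freq.get(word, 0):6.1f} vs {new_freq.get(word, 0):6.1f} per 1000 words")
```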
At the other extreme, a literary scholar might be studying the influence of one writer on another. The unit might be a whole play, but the number of units might be quite small - perhaps the 38 plays of Shakespeare compared with the 7 plays of Marlowe. If the unit is a whole play, and the focus is the literary style, a lot of human judgement will be needed. Though the total size of the corpus could be much the same as with the previous example, far more work is needed when the content unit is large - because detailed judgements will have to be made to summarize each play.
Dealing with several units at once
Often, some units overlap other units. For example, if you ask viewers of a TV program what they like most about it, some will give one response, and others may give a dozen. Is your unit the person or the response? (Our experience: it’s best to keep track of both types of unit, because you won’t know till later whether using one type of unit will produce a different pattern of responses.)
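A hedged sketch of keeping track of both unit types at once, so that either can be counted later; the respondents and answers are invented for illustration:

```python
# Hypothetical free-text answers, one list per respondent (invented data).
answers_by_person = {
    "person_1": ["the presenters", "the music"],
    "person_2": ["the news"],
    "person_3": ["the music", "the phone-ins", "the news"],
}

# Flatten to response-level units, but keep the link back to the person.
responses = [(person, answer)
             for person, answers in answers_by_person.items()
             for answer in answers]

print("persons:", len(answers_by_person))    # person as the unit -> 3
print("responses:", len(responses))          # response as the unit -> 6

# Share of *people* mentioning music versus share of *responses* mentioning music.
people_music = sum(any("music" in a for a in ans) for ans in answers_by_person.values())
resp_music = sum("music" in a for _, a in responses)
print(f"music: {people_music}/{len(answers_by_person)} people, "
      f"{resp_music}/{len(responses)} responses")
```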