In your research proposal, you will also discuss how you will conduct an analysis of your data. By the time you get to the analysis of your data, most of the really difficult work has been done. It's much more difficult to define the research problem, develop and implement a sampling plan, develop a design structure, and determine your measures. If you have done this work well, the analysis of the data is usually a fairly straightforward affair.
Before you look at the various ways of analyzing and discussing data, you need to review the differences between qualitative research/quantitative research and qualitative data/quantitative data.
Why do I have to analyze data?
The purpose of analyzing data is to obtain usable and useful information. The analysis, regardless of whether the data is qualitative or quantitative, may:
- describe and summarize the data.
- identify relationships between variables.
- compare variables.
- identify differences between variables.
- forecast outcomes.
Earlier, you distinguished between qualitative and quantitative research. It is highly unlikely that your research will be purely one or the other – it will probably be a mixture of the two approaches.
For example, you may have decided to conduct ethnographic research, which is qualitative. In your first step, you may have taken a small sample (normally associated with qualitative research) but then conducted a structured interview or used a questionnaire (normally associated with quantitative research) to determine people’s attitudes to a particular phenomenon (a qualitative aim). It is therefore likely that your mixed approach will take a qualitative approach some of the time and a quantitative approach at others, depending on the needs of your investigation.
A source of confusion for many people is the belief that qualitative research generates just qualitative data (text, words, opinions, etc.) and that quantitative research generates just quantitative data (numbers). Sometimes this is the case, but both types of data can be generated by each approach. For instance, a questionnaire (quantitative research) will often gather factual information like age, salary, and length of service (quantitative data) – but may also collect opinions and attitudes (qualitative data).
When it comes to data analysis, some believe that statistical techniques are only applicable to quantitative data. This is not so. There are many statistical techniques that can be applied to qualitative data, such as ratings scales, that have been generated by a quantitative research approach. Even if a qualitative study uses no quantitative data, there are many ways of analyzing qualitative data. For example, having conducted an interview, transcription and organization of the data are the first stages of analysis. This would then be continued by systematically analyzing the transcripts, grouping together comments on similar themes, and attempting to interpret them and draw conclusions.
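The grouping step described above can be sketched in code. This is a minimal illustration only: the theme codes and interview comments below are invented, and real thematic analysis involves careful interpretive judgment, not just tallying.

```python
from collections import defaultdict

# Hypothetical coded interview excerpts: (theme_code, comment) pairs
# assigned while reading the transcripts. All codes and comments here
# are invented for illustration.
coded_comments = [
    ("workload", "I often stay late to finish marking."),
    ("support", "My mentor checks in with me every week."),
    ("workload", "The admin tasks take time away from teaching."),
    ("support", "Colleagues share their lesson plans freely."),
    ("workload", "Deadlines cluster at the end of each term."),
]

# Group comments under each theme -- mirroring the manual step of
# collecting similar comments together before interpreting them.
themes = defaultdict(list)
for code, comment in coded_comments:
    themes[code].append(comment)

for code, comments in sorted(themes.items()):
    print(f"{code}: {len(comments)} comment(s)")
    for c in comments:
        print(f"  - {c}")
```

In practice the coding itself is the hard, interpretive part; software (see the packages listed below) mainly helps you store, retrieve, and compare coded segments.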
1. Manchester Metropolitan University (Department of Information and Communications) and Learn Higher offer a clear introductory tutorial to qualitative and quantitative data analysis through their Analyze This!!! site. In addition to teaching about strategies for both approaches to data analysis, the tutorial is peppered with short quizzes to test your understanding. The site also links out to further reading.
Complete this tutorial and use your new knowledge to complete your planning guide for your data analysis.
There are many computer- and technology-related resources available to assist you in your data analysis.
Online General Resources
Quantitative Data Analysis Resources
Common Computer-Aided Qualitative Data Analysis Packages
There are many computer packages that can support your qualitative data analysis. The following site offers a comprehensive overview of many of them: Online QDA
2. When you are done, you will also need to address concerns about the reliability and validity of your possible results. Use these questions and explanations for ideas as you complete your planning guide for this section.
Some common worries amongst researchers are:
- Will the research I’ve done stand up to outside scrutiny?
- Will anyone believe my findings?
Researchers address these questions by assessing the data collection method (the research instrument) for its reliability and its validity.
Reliability is the extent to which the same finding would be obtained if the research were repeated at another time or by another researcher. If the same finding can be obtained again, the instrument is consistent, or reliable.
Validity is understood best by the question: ‘Are we measuring what we think we are measuring?’ This is very difficult to assess. The following questions are typical of those asked to assess validity issues:
- Has the researcher gained full access to the knowledge and meanings of data?
- Would experienced researchers use the same questions or methods?
No procedure is perfectly reliable. If a data collection procedure is unreliable, it is also invalid; the reverse does not hold, however, since even a reliable procedure is not necessarily valid.
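One common way to put a number on reliability is a test-retest check: give the same instrument to the same respondents twice and correlate the two sets of scores. The sketch below uses invented scores for five hypothetical respondents; a correlation near 1 would suggest a consistent instrument.

```python
# Test-retest reliability sketch. The scores are invented for
# illustration; in a real study they would come from administering
# the same questionnaire to the same people on two occasions.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

first_admin = [12, 18, 15, 20, 9]    # scores at time 1 (hypothetical)
second_admin = [13, 17, 15, 21, 10]  # scores at time 2 (hypothetical)

r = pearson_r(first_admin, second_admin)
print(f"test-retest correlation: {r:.2f}")
```

Note that a high correlation speaks only to reliability (consistency); it says nothing about whether the instrument measures what you think it measures, which is the separate question of validity.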
Triangulation is the cross-checking of data using multiple data sources, or using two or more methods of data collection. There are different types of triangulation, including:
- time triangulation – longitudinal studies.
- methodological triangulation – the same method at different times, or different methods on the same object of study.
- investigator triangulation – uses more than one researcher.
Sampling error is a measure of the difference between the sample results and the population parameters being measured. It can never be eliminated, but when random sampling is used it occurs only by chance and is reduced as the sample size increases. When non-random sampling is used, this is not the case.
Basic questions we need to ask to assess a sample are:
- Is the sample random and representative of the population?
- Is the sample small or large?
All errors, other than sampling errors, are non-sampling errors and can never be eliminated. The many sources of non-sampling errors include the following:
- Researcher error – unclear definitions; reliability and validity issues; data analysis problems, for example, missing data.
- Interviewer error – general approach; personal interview techniques; recording responses.
- Respondent error – inability to answer; unwillingness; cheating; unavailability; low response rate.
This section was discussed in Elements of the Proposal, where there are many online resources, and your reflective journal entries will support you as you develop your ideas about reliability and validity in your planning guide. In addition, this writing tutorial specifically addresses the ways in which these issues can be explained in your research proposal.
Return to Writing the Proposal - Different Pathways