Snowball Sampling



When selecting a research sample, there are various non-representative sampling techniques we discussed in class. One of these techniques is referred to as snowballing. Dana Grinshpan from Government Executive has written an article titled "3 Bad Research Techniques That Will Ruin Your Work." The three techniques are snowball sampling, snowball research and snowball point of view. As we learned in class, snowball sampling is a research method in which you identify a group that you wish to study and then ask members of that group to identify acquaintances to also join the study.

The article notes that although snowball sampling allows us to reach subjects we might otherwise be unable to reach, the sample group is no longer random and no longer represents the population at large. Without a random sample, a researcher cannot extrapolate the findings to the population at large, and the value of the research is diminished. Although this seems like a convenient way to capture a sample of the right people, it undermines the validity of the results.
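As a rough illustration of why the resulting sample is not random, here is a minimal sketch of snowball recruiting; the population, acquaintance network and all parameters are hypothetical, not from the article:

```python
import random

random.seed(1)

# Hypothetical population of 100 people; each knows 5 acquaintances.
population = list(range(100))
acquaintances = {p: random.sample(population, 5) for p in population}

def snowball_sample(seeds, waves, per_person=2):
    """Recruit the seeds, then ask each new recruit to name acquaintances."""
    sample = set(seeds)
    current = list(seeds)
    for _ in range(waves):
        referred = []
        for person in current:
            referred.extend(acquaintances[person][:per_person])
        current = [p for p in referred if p not in sample]
        sample.update(current)
    return sample

sample = snowball_sample(seeds=[0, 1, 2], waves=3)
# Everyone recruited is linked to the seeds' social network, so the
# result is not a random draw from the population.
print(len(sample), sorted(sample)[:10])
```

Because each recruit comes through an existing member's network, people with no ties to the seed group have zero chance of selection, which is exactly why the results cannot be generalized.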

I have actually used the second technique, snowball research, which involves using the sources cited in one article to find additional information on the subject. I had not previously known that this was an improper way of conducting research. From this article I learned that the technique gives a narrow perspective on the subject and can lead to serious gaps in an argument. Initially it seems like a great way to find additional sources, but as our textbook notes, results based on secondary research reports can be inaccurate and should generally be avoided.

The third technique to avoid is snowball point of view. This is the tendency for individuals to test their hypothesis with positive examples rather than negative ones. It seems to be a social norm for people to seek out sources of information that support their argument. The article mentions that this tendency is typically found in political writing, where it leads to a biased perspective.

Although these techniques seem obvious when put on paper, they tend to be instinctive for novice researchers and those unaware of their consequences. Snowball techniques are typically the most convenient forms of data collection, but as we discussed in class they are non-representative. When results are non-representative, they cannot be generalized to an entire population. And when the data cannot speak for the population at large, it is of little use.

Article Source:

Consumer Neuroscience


In class we discussed various ways of measuring emotional responses with technology. Examples included galvanic skin response, facial electromyography and blood pressure measurements. What is interesting about these types of measurements is that they capture subconscious responses from the participant. Traditional research methods work at the cognitive level, measuring consumers' thoughts, beliefs and attitudes toward particular advertisements and products. What is difficult about cognitive assessment is that participants do not always tell the truth, or may feel pressure to respond in a particular way to please the researcher.

The Nielsen Company is a well-known research institution. Nielsen measures subconscious feelings with a process it calls consumer neuroscience. Nielsen uses this method to determine a consumer's non-conscious response to brands, products, packaging, in-store marketing, advertising and entertainment content.

Nielsen uses neuroscience methods to measure brainwave activity in real time, capturing purchase considerations at the moment they are formed in the brain. These measurements include electroencephalography (EEG), eye tracking and galvanic skin response. Nielsen also has an EEG headset that can expand testing environments beyond the laboratory and into a home or store. Its consumer neuroscience research is done by its NeuroFocus division.

Using neuroscience for consumer research seems to be an excellent way to capture what is actually occurring in a consumer's brain while exposed to a piece of media or a product. With traditional measures it is difficult to tell whether participants are actually telling the truth, and if results aren't "truthful" they do not serve the initial research goals. With portable technology that can be worn in external environments for field research, neuroscience seems to be moving toward a more accurate way of measurement. Although cognitive assessments are still helpful, neuroscience is a good way to corroborate what the participant says, or to uncover new subconscious information about ad placement, feelings toward a product or emotional responses.


Company Site:

Finding Research With SNCR


SNCR (Society for New Communications Research) is a global nonprofit research and education foundation and "think tank." Its online research directory offers sources of independent research as well as commissioned and sponsored research projects. Topics regarding media, journalism, PR, social media, business strategy and various social technologies have been studied and can be found in the directory. SNCR is sponsored by HP, Jive, Middleberg Communications and SAP.

While visiting the site, users can also participate in studies currently being conducted by SNCR. These internet studies may be biased toward those more familiar with research methods, and as we discussed in class, internet sampling only acquires respondents willing to participate. These respondents are also likely to work in communications fields, since there are few other reasons people would visit the nonprofit's site.

For future reference, SNCR seems to be a great source of information, with a research library of various publications and case studies. My only reservation comes from the fact that many of its research surveys are hosted on its own site. Such a sample is unlikely to represent a diverse population, since it is difficult to see why those outside communication fields would visit the website and participate in its surveys. From this inference, the sample probably knows more about communication strategies than people in non-communication fields do.


Site Link:

Research of Falling Groupon Stock



In November 2012, Groupon's stock fell to a historic low. Cloud 9 Living LLC, an industry-leading experience gift provider based in Boulder, Colorado, set out to find why this trend occurred. The company works with more than 500 small businesses in the tourism and hospitality industries across the U.S.

Cloud 9 Living LLC concluded that Groupon's problem has to do with a serious negative image in the small business community. From this defined problem, Cloud 9 conducted a survey directed toward a specific audience of small business owners. The article makes a valid point that Cloud 9's findings concern Groupon's tourism deals specifically, and the businesses involved. This is important to recognize because the results of Cloud 9's survey cannot be generalized to the entire population of small businesses, only to small businesses in the tourism/hospitality sectors.

The survey found that only 3% of small businesses reported a significant increase in new loyal customers, and only 14% saw an increase in revenue six months after the promotion. Cloud 9's research shows that although businesses can hold a successful "daily deal" promotion, small business owners tend to view them negatively. Again, though, this concerns tourism/hospitality small businesses; Cloud 9's research does not account for restaurant, beauty or fitness deals that may be viewed favorably by the small businesses using Groupon.

The reported results seem to fit the initial problem; however, the actual statistics may show something new. Consider the 14% of businesses that saw an increase in revenue: although this seems like a low number, isn't any increase in revenue better than none? Not all small businesses use Groupon to increase revenue. Some new businesses use Groupon to inform customers of their services. Businesses not seeing an increase in sales should remember that these are coupon offerings. Selling something for less to get people informed is a trade-off, and a business cannot expect huge revenue increases from offering discounted deals, especially if the discounts aren't low enough for customers to even consider them on Groupon.

I do not believe the primary goal of a Groupon deal is to generate a "ton" of revenue, but rather to entice the audience toward an available offering. It informs them of a deal, and the better the deal, the more Groupon sales. If a small business's primary goal is to increase revenue, then coupons for discounted deals are probably not its best option. It seems plausible that the most purchased deals on Groupon have the best discounts, and that Groupon discounts are offered by businesses looking not to directly increase revenue, but to build buzz and inform Groupon browsers.

Information Source:

CJ Olson Market Research



CJ Olson Market Research is one of the oldest market research firms in Minnesota. Founded in 1983, this downtown Minneapolis firm focuses on helping organizations learn more about their customers, products, competitors and markets. CJ Olson Market Research claims to provide reasonable pricing to help businesses accomplish their objectives.

As we discussed in class, both qualitative and quantitative research methods are provided by CJ Olson. When companies do not have the expertise or time to do research themselves, they utilize services provided by firms such as CJ Olson. The firm offers various quantitative services such as survey design, and it administers surveys through particular outlets: telephone, email and mail. As we discussed in class, these types of surveys vary in length and type of questioning, with some briefer or more confidential than others. Because survey design is complicated and time consuming, CJ Olson's expertise in this field is valuable to businesses that need research help. CJ Olson also does survey analysis and reporting, which are discussed in our class textbook; the accuracy of the analysis is crucial to reporting reliable results.

CJ Olson also offers qualitative services such as focus groups and one-on-one interviews. As we discussed in class, these are effective when done properly and can give us new insights. CJ Olson employs skilled moderators for its qualitative services, which is important for unbiased reporting and for facilitating proper qualitative research.

As discussed in class, research can be very helpful for organizations wanting to learn more about their customers, market and products; however, when done incorrectly, the results may actually cause more damage than help. Firms such as CJ Olson have the reputation and experience to maximize the reliability and value of research for an organization looking to advance its business.

CJ Olson Company Site:

Census 2012 Prone To Misinformation?


An article from Business Report casts doubt on the 2012 census research findings. Business Report states that "statistical analysis must be carried out in a responsible manner. It requires methodological soundness, theoretical coherence and proper articulation of results." The article states that there are numerous oddities in the 2012 census data. Some people seem unaffected by these oddities, showing no concern or questioning of the results, while others have pointed them out, consider the census misinforming, and therefore question the 2012 census in its entirety.

One of the oddities concerns the 2012 household income findings, which are compared against 2001 census data. The 2012 census refers to the 2001 census (which used different measurements) to claim household income has doubled since 2001 for a particular demographic. Our book describes this as using different "metrics." In 2001 the census based household income on personal income, categorized into 12 income classes, whereas the 2012 census refers to the income of the household as a whole. These differences were not stated in the 2012 census, but were spotted by various analysts.

This seems to be a fault in research reporting and statistical analysis. The census is interpreting information with different metrics and comparing them to show a relationship. If the 2012 census wanted to accurately report an increase in income, it should have compared individual incomes in both years to show a reliable and valid relationship over time.
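A toy calculation shows how switching metrics can manufacture "growth." Every figure here is invented purely for illustration and has nothing to do with the actual census numbers:

```python
# Hypothetical two-earner household; all incomes are invented.
earner_a_2001 = 20_000   # personal income, the 2001 census's metric
earner_b_2001 = 20_000

# Suppose neither individual income changes at all by 2012.
earner_a_2012 = 20_000
earner_b_2012 = 20_000

metric_2001 = earner_a_2001                   # one person's income
metric_2012 = earner_a_2012 + earner_b_2012   # the whole household's income

ratio = metric_2012 / metric_2001
print(ratio)  # 2.0 -- income appears "doubled" despite zero actual growth
```

Because the 2001 figure counts one earner and the 2012 figure counts the whole household, the ratio reflects the change of metric, not a change in income.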

Our textbook talks about reporting research: it needs to be accurate, unbiased and leave no room for misinterpreting the data. Based on this article, it seems the census compared data measured with different metrics, which cannot be done. It was interesting to learn how analysts caught this "misinformation," and that the general public would likely not notice the fault in the 2012 census reporting. This article shows how research institutions can sometimes bias information to prove a point when the underlying comparison is not valid. In this case the problem was one of metrics: using different formats of data collection to draw a conclusion and show a relationship, when those formats cannot be compared.

Article Source:

Advertising on Tablets


Business Insider released an article on November 8, 2012, claiming tablet ad spending will crush smartphones over the next four years. A chart displayed how mobile budgets will be divvied up between mobile phones and tablets/eReaders. Research by the International Data Corporation (IDC) predicted what ad spending will look like through 2016 across various mobile and tablet formats. IDC predicts the industry, worth $2.1 billion in 2011, will grow to a whopping $14.8 billion by 2016, and that while mobile phones will still dominate, tablet ad spending will rise much faster.

IDC speculates three reasons why this shift is projected to occur:
1. Tablets draw more traffic because they are easier to use.
2. Tablet users engage more with ads than their smartphone counterparts.
3. The size difference "sweetens" the deal.

The article fails to specify what type of research was done to reach the conclusion that tablet advertising will be on the rise. Because IDC is predicting a future behavior, it may have run an experiment: the independent variable would be the type of advertising (smartphone or tablet) and the dependent variable would be the amount of ad spending. IDC could have conducted this experiment with businesses and organizations that advertise, changing the independent variable and measuring their projected advertising dollars.

It is also unknown what type of research produced the three conclusions for why tablet advertising is expected to rise. These conclusions concern consumer usage, so it is likely that a survey was conducted to learn how consumers use their devices and for what reasons. A properly drawn survey could generalize to the population with a known level of confidence.
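"A known level of confidence" has a concrete meaning for a simple random sample. As a sketch, with an entirely hypothetical result of 60% agreement among n = 1,000 randomly sampled respondents, the 95% margin of error for a proportion is about 1.96 * sqrt(p(1-p)/n):

```python
import math

# Hypothetical survey result: p = proportion agreeing, n = sample size.
p, n = 0.60, 1000

# 95% margin of error for a proportion, valid only for a random sample;
# a convenience or snowball sample gets no such guarantee.
margin = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"{p:.0%} +/- {margin:.1%}")  # -> 60% +/- 3.0%
```

This is exactly what cannot be computed if the article's numbers came from a non-random sample, which is why the missing methodology matters.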

The article gives a research institute's (IDC's) perspective on how advertising dollars will be allocated in the future, but fails to point out how these projections were formed. It is interesting how many articles across the web provide statistics and projections yet fail to include how the numbers were found. I am not sure whether this means articles assume the reader will accept the information without question, or whether the research was not done properly enough to report.

Because smartphone and tablet users are a special population, it raises the question of how the sample was drawn. If a survey led to the three conclusions about why tablets will draw more advertising dollars, who was chosen to complete it, and how? Were respondents tablet/smartphone owners or not? Whether they own one device, the other, or both, their usage will differ and could largely influence the results of a survey that is meant to be generalizable.


Article Source:

SDSU's New Research Institution



On November 14th, San Diego State University announced the opening of a new integrated regenerative research institute. Tissue regeneration is an emerging medical field that harnesses the healing potential of stem cells. The new institute brings together researchers from across the world, from both the private and public sectors, to advance the field.

Researchers from SDSU and UCSD come together at this institute around a shared goal of understanding the fundamental processes of heart protection and regeneration, with a long-term vision of therapeutic strategies. Tens of millions of dollars in funding from the NIH (National Institutes of Health), the National Science Foundation and the American Heart Association are committed to the institute. The investment in regenerative medicine also aims to teach the next generation of researchers.

This blog entry differs from previous entries: those focused on reported data and findings, whereas this one looks at a source where information will be coming from. Although the institute is science-based rather than communication-based, it has a communication goal of informing incoming and upcoming researchers. It hopes to teach researchers at all levels, "from undergraduate to post-doctoral to junior faculty," about regenerative tissues.

Much of the research at this institute will be done to advance medicine; however, the procedure is similar to setting up research in communications. Before starting research, the SDSU center will have to go through an Institutional Review Board (IRB), which is discussed in our textbook on page 50 under research ethics. Under the "Common Rule," the IRB will require informed consent from participants.

Many of the ethical concerns in communication research apply to the new SDSU institute. Patients will need the right to confidentiality or anonymity. Reporting of research done at the institute will need to follow the guidelines for proper literature reviews and handling of proprietary information. Reports need to acknowledge others, use appropriate language and avoid plagiarism. These are all aspects of communication research that will apply to this new institute at San Diego State University.

Article Source:

Amazon Conducts Tablet Survey



A day after Apple released its iPad mini, Amazon sent a survey to its email subscribers asking them to weigh in on the current state of the tablet market. The survey focuses on how consumers weigh various tablet features and how much they actually know about the variety of devices on the market from Apple, Google, Amazon and Samsung. As an incentive to take the survey, Amazon offered a $5 gift card to use on its site.

I think this seems like a good idea for Amazon. Not all Amazon users visit the site for technology, so because Amazon sells such a large variety of products, it gets a wide range of people to survey. The gift card is also a good incentive: as we learned in the textbook, surveys tend to generate more responses when incentives are included. Results of the survey have not been released; Amazon may never release them, instead using what it learned to better advertise its own tablet.

A screenshot of the Amazon Tablet Survey is shown below:

Information Source:

America's Facebook Generation Is Reading Strong


An NPR article discusses a study by the Pew Research Center. According to the article, it seems surprising that the Facebook generation of young adults is reading more books than believed. Pew's study found that 60 percent of Americans under 30 used the library in the past year, and that 8 in 10 Americans under 30 have read a book in the past year, compared with 7 in 10 adults in general. The study summarizes that young adults "are reading, more likely to read and more likely to use a library in comparison to adults."
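Whether the 8-in-10 versus 7-in-10 gap means anything depends on sampling error, which in turn depends on sample sizes the article never reports. A quick two-proportion z-test sketch, using entirely hypothetical sample sizes, shows the kind of check a reader would want to make:

```python
import math

# Hypothetical sample sizes -- the article does not report Pew's actual n.
p_young, n_young = 0.80, 500    # under-30 respondents who read a book
p_all,   n_all   = 0.70, 2000   # adults overall

# Standard error of the difference between the two proportions.
se = math.sqrt(p_young * (1 - p_young) / n_young
               + p_all * (1 - p_all) / n_all)
z = (p_young - p_all) / se
print(round(z, 1))  # -> 4.9, well above 1.96, so significant at these assumed sizes
```

With different (smaller) real sample sizes the gap could shrink toward noise, which is exactly why the missing methodology matters.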

The problem with this study seems a little obvious. Since many young adults are in college, we can already assume they read more than adults who are busier with full-time jobs and raising children. College students also spend more time in libraries studying for exams and doing homework, and since many live on or near campus, it seems likely they are in libraries more than adults simply because they have close access. There is also no baseline showing that current young adults read more than in the past, so we cannot say young adults read more during this "Facebook generation" compared with the former "non-Facebook generation." Where is Pew Research getting this baseline of "non-Facebook" users? There is no article or abstract to support the claim that current Facebook users are reading more.

The article does not explain how results were gathered from the population. We cannot assess the validity of the results when we don't know how the population was sampled. It also does not explain how frequently, or when exactly, the results were recorded, which undermines reliability. Were the results taken each month? Or concluded after only one month? The article does not give this background information, so the results seem confusing and improperly summarized by NPR.

Article Source:
