
A comparison of smartphones to paper-based questionnaires for routine influenza sentinel surveillance, Kenya, 2011–2012

Overview of attention for article published in BMC Medical Informatics and Decision Making, December 2014

About this Attention Score

  • Average Attention Score compared to outputs of the same age
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

3 X users

Citations

43 Dimensions

Readers on

140 Mendeley
Title
A comparison of smartphones to paper-based questionnaires for routine influenza sentinel surveillance, Kenya, 2011–2012
Published in
BMC Medical Informatics and Decision Making, December 2014
DOI 10.1186/s12911-014-0107-5
Pubmed ID
Authors

Henry N Njuguna, Deborah L Caselton, Geoffrey O Arunga, Gideon O Emukule, Dennis K Kinyanjui, Rosalia M Kalani, Carl Kinkade, Phillip M Muthoka, Mark A Katz, Joshua A Mott

Abstract

Background: For disease surveillance, manual data collection using paper-based questionnaires can be time-consuming and prone to errors. We introduced smartphone data collection to replace paper-based data collection for an influenza sentinel surveillance system in four hospitals in Kenya. We compared the quality, cost and timeliness of data collection between the smartphone data collection system and the paper-based system.

Methods: Since 2006, the Kenya Ministry of Health (MoH), with technical support from the Kenya Medical Research Institute/Centers for Disease Control and Prevention (KEMRI/CDC), has conducted hospital-based sentinel surveillance for influenza in Kenya. In May 2011, the MoH replaced paper-based collection with an electronic data collection system using the Field Adapted Survey Toolkit (FAST) on HTC Touch Pro2 smartphones at four sentinel sites. We compared 880 paper-based questionnaires dated January 2010–June 2011 and 880 smartphone questionnaires dated May 2011–June 2012 from the four surveillance sites. For each site, we compared the quality, cost and timeliness of each data collection system.

Results: Incomplete records were more likely in data collected with pen and paper than in data collected with smartphones (adjusted incidence rate ratio (aIRR) 7, 95% CI: 4.4–10.3). Errors and inconsistent answers were also more likely in pen-and-paper data than in smartphone data (aIRR 25, 95% CI: 12.5–51.8). Smartphone data were uploaded into the database in a median of 7 days, whereas paper-based data took a median of 21 days to be entered (p < 0.01). It cost USD 1,501 (9.4%) more to establish the smartphone data collection system (USD 17,500) than the pen-and-paper system (USD 15,999). Over two years, however, the smartphone data collection system was USD 3,801 (7%) less expensive to operate (USD 50,200) than the pen-and-paper system (USD 54,001).

Conclusions: Compared to paper-based data collection, the electronic data collection system produced fewer incomplete records, fewer errors and inconsistent responses, and delivered data faster. Although start-up costs were higher, the overall costs of establishing and running the electronic data collection system were lower than those of the paper-based system. Electronic data collection using smartphones has the potential to improve timeliness and data integrity and to reduce costs.
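The cost comparison in the abstract reduces to simple arithmetic on the four reported totals. A minimal sketch checking those figures (variable names are illustrative, not from the paper):

```python
# Start-up and two-year operating costs reported in the abstract (USD).
START_SMARTPHONE = 17_500
START_PAPER = 15_999
OPERATE_SMARTPHONE = 50_200
OPERATE_PAPER = 54_001

# Extra start-up cost of the smartphone system, absolute and relative.
startup_diff = START_SMARTPHONE - START_PAPER             # 1,501
startup_pct = 100 * startup_diff / START_PAPER            # ~9.4%

# Two-year operating savings of the smartphone system.
operating_savings = OPERATE_PAPER - OPERATE_SMARTPHONE    # 3,801
operating_pct = 100 * operating_savings / OPERATE_PAPER   # ~7%

print(startup_diff, round(startup_pct, 1))         # 1501 9.4
print(operating_savings, round(operating_pct, 1))  # 3801 7.0
```

Both percentages match those reported: the 9.4% extra start-up cost is relative to the paper system's start-up total, and the 7% operating saving is relative to the paper system's operating total.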

X Demographics

The data shown below were collected from the profiles of 3 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 140 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
United Kingdom 1 <1%
Canada 1 <1%
Unknown 138 99%

Demographic breakdown

Readers by professional status Count As %
Student > Master 34 24%
Researcher 20 14%
Student > Ph.D. Student 15 11%
Student > Bachelor 12 9%
Student > Doctoral Student 10 7%
Other 22 16%
Unknown 27 19%
Readers by discipline Count As %
Medicine and Dentistry 37 26%
Nursing and Health Professions 15 11%
Computer Science 14 10%
Social Sciences 9 6%
Psychology 7 5%
Other 23 16%
Unknown 35 25%
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 02 September 2015.
All research outputs
#14,720,444
of 23,577,761 outputs
Outputs from BMC Medical Informatics and Decision Making
#1,123
of 2,027 outputs
Outputs of similar age
#190,871
of 356,735 outputs
Outputs of similar age from BMC Medical Informatics and Decision Making
#21
of 39 outputs
Altmetric has tracked 23,577,761 research outputs across all sources so far. This one is in the 35th percentile – i.e., 35% of other outputs scored the same or lower than it.
So far Altmetric has tracked 2,027 research outputs from this source. They receive a mean Attention Score of 4.9. This one is in the 38th percentile – i.e., 38% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 356,735 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 44th percentile – i.e., 44% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 39 others from the same source and published within six weeks on either side of this one. This one is in the 35th percentile – i.e., 35% of its contemporaries scored the same or lower than it.
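The percentile figures above all use the same definition: the percentage of outputs in the comparison pool whose score is the same as or lower than this one's. A minimal sketch of that definition, using made-up scores rather than Altmetric's actual data:

```python
def percentile_same_or_lower(score, all_scores):
    """Percent of outputs in the pool scoring <= the given score."""
    same_or_lower = sum(1 for s in all_scores if s <= score)
    return 100 * same_or_lower / len(all_scores)

# Hypothetical pool of attention scores, for illustration only.
scores = [0, 0, 1, 1, 2, 2, 2, 3, 5, 12]

# An output scoring 2: seven of the ten pool scores are <= 2.
print(percentile_same_or_lower(2, scores))  # 70.0
```

Note that ties count toward the percentile ("same or lower"), which is why an output can sit above the midpoint even when many peers share its score.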