CITIZEN SCIENCE AND DIGITALIZATION
Daria S. Bylieva
Associate Professor, Department of Social Science, Peter the Great St. Petersburg Polytechnic University,
St Petersburg, Russia; e-mail: [email protected]
Victoria V. Lobatyuk
Associate Professor, Department of Social Science, Peter the Great St. Petersburg Polytechnic University,
St Petersburg, Russia; e-mail: [email protected]
Anna V. Rubtsova
Director of the Graduate School of Applied Linguistics, Interpreting and Translation, Peter the Great St. Petersburg Polytechnic University,
St Petersburg, Russia; e-mail: [email protected]
Citizen Science: Concept, Problems and Prospects
УДК 001.81
DOI: 10.24411/2079-0910-2021-11004
Thanks to information and communication technologies, citizen science is today becoming a powerful tool worldwide, allowing scientists to obtain data of unprecedented scale. However, despite its huge potential and a fairly long history of use, serious problems remain that stand in the way of obtaining scientifically reliable data: insufficient qualification of citizens and sampling biases in biodiversity studies. The article presents methods of overcoming these limitations in order to implement projects that require highly qualified participants. The principles of motivational model design and of communication policy for participants in citizen science projects should be adequate to the scientific tasks, their scope and complexity.
© Daria S. Bylieva, Victoria V. Lobatyuk, Anna V. Rubtsova, 2021
Keywords: citizen science, research motivation, citizen scientists, public participation, science engagement.
Acknowledgment
The research was carried out with the support of the Russian Foundation for Basic Research (RFBR), research grant No. 19-111-50614.
The concept of citizen science
Today, the development of information and communication technologies has made citizen science a powerful research tool in diverse scientific fields around the world. Its tremendous capacity is determined by the technological innovations of the Web 2.0 era: widespread domestic Internet connections; the falling cost of sophisticated mobile devices with numerous sensors and ample storage; the continued development of Internet technologies and standards, such as Extensible Markup Language (XML), that facilitate the transfer of information between computers; the increased accuracy of the Global Positioning System (GPS); and the growth of sophisticated Web applications, such as the virtual microscope [Gashkova et al., 2018; Haklay, 2013], small add-ons for smartphones measuring aerosol properties [Land-Zandstra et al., 2016], or devices connecting a smartphone to a bat detector to record ultrasonic bat sounds [Mac Aodha et al., 2018]. The scale of such projects is impressive: for instance, eBird alone has gathered more than 100 million records of individual bird species (i.e., species observations) from 252 countries in a year [Kelling et al., 2019].
What do we mean by citizen science? The term "citizen science" was first used by Alan Irwin in 1995 as the title of a book discussing the joint efforts of citizens and scientists to tackle environmental problems [Irwin, 1995]. The term "civic science" was used in a close sense, as an effort to democratize science by involving citizens as researchers [Shannon, Antypas, 1996], while "community science" emphasized community-based monitoring and management [Carr, 2004]. A number of scientists continued to stress the possibilities of citizen science in the context of citizens' influence on politics, providing models for more participatory forms of knowledge production and policy making [Fischer, 2000; Ottinger, 2010]; nevertheless, citizen science has become primarily an effective means of achieving scientific results with the help of citizens. In 1996, Rick Bonney from Cornell's Laboratory of Ornithology wrote about citizen science in the sense of public participation in research [Bonney, 1996]. In most definitions of citizen science, two key points can be distinguished: the participation of non-professionals / volunteers and the large number of people participating in a scientific project, and it is the first point that is usually emphasized. At the same time, we must not forget that two centuries ago almost all scientists made their living in some other profession [Silvertown, 2009, p. 467]. In fact, Darwin, who traveled as an unpaid companion to Captain Robert FitzRoy, Benjamin Franklin and Thomas Jefferson may all be called unprofessional or "amateur" scientists [Dickinson et al., 2012, p. IX]. Mass participation in scientific projects also has a rather long history. The first scientist to recruit masses of volunteers, asking them to measure tides around the clock at the same time for two weeks in June 1835, was William Whewell, who made major discoveries in predicting ocean tides [Cooper, 2016, p. 4]. Amateur astronomers
founded the Astronomical Society of the Pacific in 1889, the National Weather Service Cooperative Observer Program began in 1890, and the Christmas Bird Count of the National Audubon Society has been running since 1900. The third key point is the direct participation of citizens in the study, which distinguishes citizen science from other forms of public involvement in science that do not imply participation in scientific activity itself: being a subject in a research study, or providing computing resources for projects such as SETI@home and crowdsourcing. However, some researchers consider the last two types of participation to be part of citizen science [Curtis, 2018; Haklay, 2013]. Moreover, Thiel et al. [Thiel et al., 2014] consider that all people who help in marine research are citizen scientists (transporting scientists, launching scientific equipment, carrying scientific equipment on their vessels, taking samples and passing them on to scientists). Hybrid studies are also emerging, for example, in social psychology or in research on diet effects [Klimenko et al., 2018], where the study group is treated as scientists, receiving significantly more information than ordinary respondents and participating in the development of research hypotheses [Makhnach et al., 2019].
In the Oxford English Dictionary, the term citizen science appeared only in 2014, with the definition "scientific work undertaken by members of the general public, often in collaboration with or under the direction of professional scientists and scientific institutions" [Eitzel et al., 2017, p. 5]. Similar concepts are used in specific subject areas, for example, "volunteered geographic information" in geography [Goodchild, 2007; Haklay, 2013], "community-based participatory research" in environmental justice contexts [Minkler, Wallerstein, 2011], and "local knowledge" in public health research [Corburn, 2005]. There are less popular terms combining the diverse possibilities for citizen participation in science: Public Participation in Scientific Research [Bonney, Ballard et al., 2009; Shirk et al., 2012] or Community and Citizen Science [Ballard et al., 2017]. In some non-English-speaking countries it is difficult to translate the term citizen science: it may be associated with "science about being a citizen", as in Kodanikuteadus (Estonian) [Eitzel et al., 2017, p. 8], or understood as the opposite of military science or as related to citizenship, as in Grazhdanskaya nauka (Russian), while in Chinese the term can be translated as "public science".
Types of citizen science
Citizen science may be classified on various grounds.
— Duration [Ballard et al., 2008]: most often, citizen science projects involve long-term cooperation, but recently alternative options have appeared, for example, short-term activities lasting as little as 48 hours [Reeves, Simperl, 2019]. Projects exceeding a decade are considered long-term: for example, the collection of cetacean stranding data since 1948 [MacLeod et al., 2005] or of data on cetaceans in Europe since 1970 [Peltier et al., 2012]. Mid-term monitoring programs last several years; short-term ones last several months or weeks, or are even one-off opportunities.
— Aims: educating and empowering people to address scientific questions that interest them, educating people first-hand about conservation issues, connecting urban populations with nature, centralizing critical information on the distribution and abundance of organisms, and bringing people into scientific professions [Cooper et al., 2008].
— The role of technologies (primarily information and communication): from purely virtual projects, where data, processing, and results are all on the Internet (which may be called citizen cyberscience [Curtis, 2015], virtual citizen science [Wiggins, Crowston, 2011] or online citizen science [Curtis, 2015]), to those where data collected from the physical environment are subjected to non-computer analysis and information and communication technologies play only an auxiliary role. Additional features are provided by the sensors of mobile phones (transceivers (mobile network, WiFi, Bluetooth), FM and GPS receivers, camera, accelerometer, digital compass and microphone, plus links to external sensors). These can be used, for instance, for sensing air quality (GasMobile, AirProbe, etc.) [Cuff et al., 2008; Sirbu et al., 2015], noise levels (NoiseTube) [Maisonneuve et al., 2009, 2010] and even for linking different locations to wellbeing (Mappiness) [MacKerron, Mourato, 2013].
— Special requirements for volunteers. Although most citizen science projects do not specify the characteristics of volunteers, in some cases special knowledge, skills, occupation, etc. are required. Thus, in a number of cases fishermen [Grant, Berkes, 2007; Le Fur et al., 2011] or hunters and divers [Edgar, Stuart-Smith, 2009] are involved in marine research. There are projects organized specifically for schoolchildren or undergraduates [Eastman et al., 2014; Osborn et al., 2005] or, conversely, oriented to highly trained biology graduate students [Azzurro et al., 2013].
— Complexity degree of the tasks / necessity of training: two interrelated criteria. The more complex the task, the more intense the training of volunteers should be.
For virtual projects, the complexity varies from watching and intuitively understanding a training video in order to classify / process / recognize the collected data (as in Zooniverse) to building models, such as gravitational lenses [Kung, 2018; Kung et al., 2018]. In real-world projects, it ranges from sensor data collection [Mahajan et al., 2020] to the analysis of samples (monitoring a stream and its benthic community) [Brooks et al., 2019; Gowan et al., 2007; Lowry et al., 2019]. M. Haklay's classification [Haklay, 2013] suggests a division into crowdsourcing, distributed intelligence, participatory science and extreme citizen science, where the latter implies the most demanding participation of volunteers: citizens conduct research almost independently, with minimal support from scientists [Stevens et al., 2014]. On the contrary, for virtual projects that do not require serious volunteer training and allow occasional participation contributing to existing databases, the following terms have been offered: "opportunistic citizen science" [Martin et al., 2016] or "volunteer thinking" / "distributed thinking" [Curtis, 2015].
An interesting classification in terms of the stages of scientific research in which volunteers participate is proposed by Wiggins and Crowston [Wiggins, Crowston, 2011]: in the contributory model, volunteers take part in data collection (and possibly analyze data and disseminate results); in the collaborative model, in data collection and the analysis of samples and data (possibly also designing the study, interpreting data, drawing conclusions and disseminating results). In the co-created model, people take part in all stages of scientific research (defining the question, gathering information, developing hypotheses, designing the study, collecting data, analyzing samples, analyzing data, interpreting data, drawing conclusions, disseminating results, discussing results and asking new questions). Several scholars indicate that when collecting data volunteers can contribute significantly more than a workforce, provided that these non-professionals have special skills or knowledge, and sometimes new insights [Bonney, Cooper et al., 2009; Wiggins, Crowston, 2011; Foster-Smith, Evans, 2003, pp. 211-212].
Citizens' participation in science can also be considered in another aspect, as a public request for research, in Wilderman's terminology "community science": community consulting, community-defined research, community workers, and community-based participatory research [Wilderman, 2007]. Combining the last two aspects of volunteer participation, Shirk et al. [Shirk et al., 2012] offer the following classification according to volunteer activity: contractual projects (communities ask professional researchers to conduct a certain investigation); contributory projects (scientists develop the project, citizens contribute data); collaborative projects (in addition to the previous, citizens participate in additional activities, refine the project design, analyze data and/or disseminate findings); co-created projects (designed jointly by scientists and members of the public, with citizens actively involved in most or all aspects of the research process); and collegial contributions, where non-credentialed individuals conduct research independently, with varying degrees of expected recognition by institutionalized science and/or professionals [Shirk et al., 2012].
We may also identify comprehensive clustering-based classifications indicating the basic types of citizen science projects: Wiggins and Crowston distinguish Action, Conservation, Investigation, Virtual and Education projects [Wiggins, Crowston, 2011].
Quality problems of data received from participants and their solutions
Despite the attractiveness of citizen science, a number of problems arise in scientific research. The first problem that made scientists doubt the effectiveness of citizen science is the scientific reliability of the data, threatened by volunteers' lack of knowledge or inaccuracy.
Validation and verification are two complementary ways of processing scientific data collected by volunteers. Validation may be defined as the process of checking whether specific information satisfies a defined criterion and can therefore be interpreted correctly, so that its subsequent use will be free of misinterpretation. Verification is defined as the process of checking data accuracy by means of statistical methods [Tweddle et al., 2012].
The quality of data received from non-professionals can vary significantly due to the wide variety of their behaviour, experience, and capabilities [Bird et al., 2014; Cohn, 2008]. However, Cohn highlights, for instance, that in identifying crabs along the Atlantic Coast from New Jersey to Maine, seventh graders were right 95 percent of the time, and even third graders were right 80 percent of the time (an acceptable accuracy rate for most ecological studies) [Cohn, 2008; Delaney et al., 2008]. In more complex cases, when volunteer divers surveyed densities of fishes and macroinvertebrates, 15% of volunteers could not reach an index of similarity of at least 90% between their work and that of an instructor [Edgar, Stuart-Smith, 2009, p. 54]. Nevertheless, in the end no significant differences between data produced by volunteers and professionals were evident for any of the community metrics examined, including estimates of the number of species per transect, total faunal densities per transect and mean size of fishes [Edgar, Stuart-Smith, 2009, p. 60].
A 2011 study of the practical validation and quality-control methods used in citizen science research projects [Wiggins et al., 2011] found that the most common method is expert review, followed by photo submissions; a third of the projects required the submission of paper data sheets along with online data submissions. Other techniques include quality assurance project plans, repeated samples/tasks, participant tasks involving control items, uniform or calibrated equipment, personal knowledge of participant skills/expertise, participant training, participant testing, rating participant performance, filtering of unusual reports, contacting participants about unusual reports, automatic recognition techniques, digital vouchers, data triangulation, data normalization, data mining, and data quality documentation.
For verification, one may compare volunteer data with results obtained by experts, or apply control with a variety of statistical methods. For example, in a study of the abundance and diversity of crabs, counts made by volunteers were excluded from analysis if they differed by more than 3.5% from counts made by scientists [Culver et al., 2010]. In a study of imposex in the dogwhelk Nucella lapillus from the North Sea, correlation analysis of random expert and volunteer ratings yielded coefficients of 0.699 and 0.865 [Evans et al., 2000]. In research on the distribution and abundance of small plastic debris on beaches, data were compared with control material using Student's t-test (for paired or independent samples, depending on the degree of independence of the replicates) [Hidalgo-Ruz, Thiel, 2013]. In a study of habitat variables, species richness and abundance, correlation tests (Spearman rank and Cronbach's alpha) and the Czekanowski proportional similarity index were used [Goffredo et al., 2010]. In a study of microplastic abundance, volunteer data were contrasted with laboratory recounts using regression analysis; after the removal of three data points that fell outside the 95% confidence interval, the correlation coefficient increased from 0.681 to 0.984 [Hidalgo-Ruz, Thiel, 2013]. For measuring proportional agreement between two sets of data, the Kappa coefficient is used [Assis et al., 2009; Monk et al., 2008]. For water monitoring, statistical comparisons were applied (correlation analysis, PE, Student's t-test, and means' variability): between the data of the Together4Water project and the reference station, the Pearson correlation coefficient ranged between 0.91 and 0.98 for all citizens [Fehri et al., 2020].
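To make the verification step concrete, the following minimal Python sketch computes the three measures most often mentioned above (Pearson correlation, a paired Student's t-test, and the Kappa coefficient for categorical agreement) using the standard scipy and scikit-learn routines; the paired counts and labels are invented for illustration.

```python
import numpy as np
from scipy.stats import pearsonr, ttest_rel
from sklearn.metrics import cohen_kappa_score

# Paired counts of the same quadrats by volunteers and experts (invented data).
volunteer = np.array([12, 7, 30, 5, 18, 22, 9, 14])
expert    = np.array([11, 8, 28, 5, 20, 21, 10, 13])

r, p_r = pearsonr(volunteer, expert)    # linear agreement between the two series
t, p_t = ttest_rel(volunteer, expert)   # paired Student's t-test for systematic bias

# Kappa measures proportional agreement on categorical ratings,
# e.g. species identifications coded as integer labels.
vol_ids = [0, 1, 1, 2, 0, 2, 1, 0]
exp_ids = [0, 1, 2, 2, 0, 2, 1, 0]
kappa = cohen_kappa_score(vol_ids, exp_ids)

print(f"Pearson r={r:.3f} (p={p_r:.3f}), t={t:.2f} (p={p_t:.3f}), kappa={kappa:.3f}")
```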
The most difficult problem for volunteers is the identification of species. It leads to uneven species detectability, including false positives (misidentifications of observed organisms) and false negatives (failures to report species that were present). False-negative errors can usually be adjusted for by direct estimation of detection probabilities [MacKenzie, 2018]. False positives can in some cases be eliminated if they fall outside the norm of occurrence for a species at a particular time or place; in addition, hierarchical models are used to overcome false absences and false presences [Royle et al., 2012] and species misidentifications [Conn et al., 2013]. In some cases, however, correcting false positives in statistical analyses requires additional sources of information, such as testing participants' species identification abilities or validation approaches that estimate observer skill levels [Kelling et al., 2015]. An interesting alternative is collaborative data quality management, where validation of previous data is done by the participants themselves [Wiggins, He, 2016]: for example, users assess the correctness of data, and the status of various users' reports at different times is tracked in the database, which ultimately allows checking complete reports and verifying the contributed data [Tiufiakov et al., 2018].
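As a minimal illustration of the false-negative adjustment via detection probabilities, the sketch below assumes the simplest possible case (a constant per-visit detection probability and independent repeat visits); the numbers are invented, and real analyses rely on full occupancy models rather than this one-line correction.

```python
# Simplest false-negative correction: with per-visit detection probability p
# and K independent visits to an occupied site, the chance of at least one
# detection is 1 - (1 - p)**K; dividing the naive occupancy estimate by that
# quantity corrects for presences that were never recorded.

def adjusted_occupancy(naive_occupancy: float, p_detect: float, n_visits: int) -> float:
    p_detected_at_least_once = 1.0 - (1.0 - p_detect) ** n_visits
    return naive_occupancy / p_detected_at_least_once

# A species was recorded at 30% of sites; detectability 0.4 per visit, 3 visits:
print(adjusted_occupancy(0.30, 0.4, 3))   # ≈ 0.383, i.e. true occupancy is higher
```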
Today, great technical opportunities exist for improving the quality of volunteers' contributions. Mobile technology can help avoid some errors by transferring some data automatically [Parrish et al., 2018]. Special mobile applications, for example mPING [Elmore et al., 2014], or smart sensors [Chen et al., 2017; Mahajan et al., 2020] can give greater precision, as can automatic information filters. In the eBird and FeederWatch applications, for example, more than 600 geographic and numeric data quality filters allow rapid data review and electronic communication with observers to validate questionable observations. For instance, the "smart filter" system in FeederWatch has a checklist of "allowable" species for each US state and Canadian province. If a report exceeds the maximum allowed for the species/month/region combination and/or contains species that do not appear on the standard state/province checklists, a message is sent to the participant offering either to correct or to confirm the data [Bonter, Cooper, 2012]. Among the new approaches using machine learning methods is the Hybrid Expert Ensemble System (HEES), which combines an Expert System (ES) and machine-induced models [Wessels et al., 2019].
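The logic of such filters can be illustrated with a toy sketch; the species list, thresholds and function names below are invented for illustration and do not reproduce the actual eBird/FeederWatch implementation.

```python
# Toy "smart filter": each region/month pair has allowable species with a
# maximum plausible count; violations are flagged for observer follow-up
# rather than silently rejected. All names and thresholds are invented.
ALLOWED = {("NY", 1): {"black-capped chickadee": 40, "blue jay": 15}}

def review_report(region: str, month: int, species: str, count: int) -> str:
    limits = ALLOWED.get((region, month), {})
    if species not in limits:
        return "flag: species not on the regional checklist; ask the observer to confirm"
    if count > limits[species]:
        return f"flag: count exceeds the regional maximum of {limits[species]}"
    return "accept"

print(review_report("NY", 1, "blue jay", 12))        # accept
print(review_report("NY", 1, "blue jay", 50))        # flagged: implausible count
print(review_report("NY", 1, "painted bunting", 1))  # flagged: not on checklist
```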
Another common source of bias is observer-level variability in sampling. In virtual projects, the accuracy of the individual participant is secondary to the "wisdom of the crowd" (the mean value of a measurement made by many observers may be centered on the true value) emerging through the use of voting or aggregation algorithms [Edmondson et al., 2012]. However, the contributions of individuals who are inclined to overestimate or underestimate indicators may affect overall results [Bird et al., 2014]. A technical solution in virtual projects is aggregation algorithms that give more weight to those who are more accurate and/or who contribute more [Marshall et al., 2016]. For large-scale projects based on offline observations, analysis of the participants reveals trends: for example, the most active observers in eBird not only submit more checklists but also report birds that are harder to identify [Kelling et al., 2015], which helps to calibrate human "sensors" [Kelling et al., 2015; Rachlin et al., 2011].
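A minimal sketch of such an accuracy-weighted aggregation might look as follows; the user names, accuracy scores and labels are invented, and production systems use considerably more elaborate weighting schemes.

```python
from collections import defaultdict

def weighted_vote(labels_by_user: dict[str, str], accuracy: dict[str, float]) -> str:
    """Return the label with the largest accuracy-weighted support."""
    scores: dict[str, float] = defaultdict(float)
    for user, label in labels_by_user.items():
        scores[label] += accuracy.get(user, 0.5)   # unknown users get a neutral weight
    return max(scores, key=scores.get)

votes = {"ann": "spiral", "bob": "elliptical", "eve": "spiral", "dan": "elliptical"}
acc   = {"ann": 0.95, "bob": 0.60, "eve": 0.90, "dan": 0.55}
print(weighted_vote(votes, acc))   # "spiral": the historically accurate raters prevail
```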
Sampling biases problems in biodiversity projects and their solutions
Projects related to biodiversity have a long history and have become especially popular recently due to broad technological capabilities; nevertheless, technology is still unable to replace humans in consistently classifying organisms to species.
However, projects devoted to the identification and description of biological species face specific problems associated with obtaining nonrandom, opportunistic data that do not fit a probabilistic sampling scheme and are therefore not representative. The main types of bias are as follows:
1) preferences for certain species: particularly over-reporting of rare species and under-reporting of common species [Troudet et al., 2017; Tulloch, Szabo, 2012], and underestimation of small birds [Kamp et al., 2016];
2) temporal biases: time of day, season [Ahrends et al., 2011; Peterson et al., 2008] and weather [Oliveira et al., 2018], all of which can affect the detectability of species [Ellis, Taylor, 2018];
3) spatial bias: preference for areas next to home [Dennis, Thomas, 2000; Geldmann et al., 2016; Mair, Ruete, 2016], easily accessible areas [Botts et al., 2011; Kadmon et al., 2004; Lawler, O'Connor, 2004; Reddy, Davalos, 2003], areas of great species diversity [Hijmans et al., 2000], or reserves [Hornsten, Fredman, 2000; Tulloch et al., 2013];
4) uneven intensity of observation / recorder effort: changes in the number of observations over time (existing trends may be masked or false trends may appear because of a simple trend in observation effort [Gu, Swihart, 2004; Kery et al., 2006]), or the failure of volunteers to report all the species detected [van Strien et al., 2010].
By and large, to obtain relevant data one can either use structured surveys with strictly designed protocols or, in semi-structured projects, make subsequent adjustments to the data.
The first direction implies that the research program is strictly regulated. Indeed, some of the problems can be solved by standardizing sampling effort and by repeated surveys in the same places. However, strict regulation of the places and times of data collection will scare away volunteers, so some program managers scaffold participation, recruiting a large number of participants to collect incidental observations while funneling a subset of very committed volunteers into stricter, more labor-intensive protocols [Dickinson et al., 2010]. Examples of strict standardization are the Swiss Common Breeding Bird Survey ("Monitoring Häufige Brutvögel"), which uses a simplified territory-mapping protocol and specified transect routes, with each square surveyed three times during the breeding season, and the Breeding and Wintering Birds of Britain and Ireland [Balmer et al., 2013]. A standardized approach ensures high accuracy of information, but the data must be available in sufficient quantity, of appropriate quality and with adequate geographic span [Edgar, Stuart-Smith, 2009, p. 52], and it is clear that as requirements tighten, the amount of data will decrease.
The second direction involves processing the received data, either by filtering or by statistical processing. The latter requires studying the characteristics of volunteers' activity, which can create certain biases, as a second factor alongside the collected data. In this context, an interesting study [Heilmann-Clausen, Læssøe, 2012] indicates that an observed increase in the host range diversity of wood-inhabiting fungi is related not to a change in the fungi's way of life but to changes in the way data were collected (even allowing for the so-called species accumulation effect, whereby measured species diversity increases with effort [Henderson, 2003]). A comparative study [Johnston et al., 2019] demonstrates that including effort data as covariates yields more reliable inferences than treating checklists with non-detections as complete, conducting spatial subsampling, or filtering the data by effort variables.
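The idea of including effort variables as covariates can be sketched on synthetic data; the model below is an ordinary logistic regression with invented effort variables, not the actual workflow of [Johnston et al., 2019].

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
habitat  = rng.normal(size=n)                # e.g. standardized forest cover
duration = rng.uniform(0.2, 4.0, size=n)     # hours spent observing (effort)
distance = rng.uniform(0.1, 10.0, size=n)    # km travelled (effort)

# Simulate detections that depend on habitat AND on effort:
p = 1.0 / (1.0 + np.exp(-(0.8 * habitat + 0.5 * np.log(duration) - 2.0)))
detected = rng.random(n) < p

# Fitting habitat together with effort covariates lets the model absorb
# effort-driven variation instead of attributing it to ecology.
X = np.column_stack([habitat, np.log(duration), np.log(distance)])
model = LogisticRegression().fit(X, detected)
print(dict(zip(["habitat", "log_duration", "log_distance"],
               model.coef_[0].round(2))))
```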
Kery et al. [Kery et al., 2010] propose including "observation effort" in the model to refine the data (represented by a parameter for detection probability, as strings of detectability of an occupied site on a quadrat-by-day basis, without aggregation by site and year), in order to distinguish species distribution from species detectability [van Strien et al., 2010]. They also suggest taking into account, in addition to detection probability, the fact that many collectors of opportunistic data do not report all the species they detect. An interesting approach to evaluating recorder effort is to use the proportion of a suite of common species ("benchmark species") found at a given location and time [Hill, 2012]. In general, statistical models (linear, additive, mixed-effects or hierarchical) can be used to adjust data, taking into account the influence of predictors (often groups) that increase variability in the data [Bird et al., 2014, pp. 12-13; Miller et al., 2019]. Geldmann et al. [Geldmann et al., 2016] suggest using a point process model (PPM) instead of grid- or polygon-based analysis, which makes it possible to take into account features of the infrastructure (for example, roads) that affect spatial bias. Another direction suggests using a probabilistic model for the joint analysis of presence-only data together with presence-absence and other data sets collected via systematic surveys (albeit less numerous) to overcome spatial bias [Fithian et al., 2015; Giraud et al., 2016].
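The benchmark-species measure of recorder effort mentioned above reduces to a simple proportion; in the toy sketch below the benchmark suite is invented.

```python
# Recorder effort as the share of a fixed suite of common "benchmark"
# species actually reported on a visit [Hill, 2012]. Placeholder species.
BENCHMARK = {"robin", "blackbird", "wren", "great tit", "chaffinch"}

def recorder_effort(reported_species: set[str]) -> float:
    return len(reported_species & BENCHMARK) / len(BENCHMARK)

print(recorder_effort({"robin", "wren", "osprey"}))   # 0.4 -> a casual visit
print(recorder_effort(BENCHMARK | {"osprey"}))        # 1.0 -> a thorough visit
```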
Machine-learning models (boosted regression trees, random forests, artificial neural networks, genetic algorithms) allow the use of more predictors, non-linear relationships between predictors and responses, and other features. In particular, spatial or temporal bias can be overcome in linear and additive mixed-effects models with the help of variance weighting, reducing the importance of undersampled areas or down-weighting heavily sampled areas to reduce their influence in models [Dudik et al., 2006], or by hierarchical Bayesian approaches that explicitly model sampling in relation to a latent underlying process [Latimer et al., 2006].
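Down-weighting heavily sampled areas can likewise be sketched in a few lines; here each record is weighted inversely to the number of records in its grid cell, so a much-visited location does not dominate the fitted trend (data are synthetic).

```python
import numpy as np
from collections import Counter
from sklearn.linear_model import LinearRegression

cells = ["a", "a", "a", "a", "b", "b", "c"]        # grid cell of each record
x = np.array([[1.0], [1.1], [0.9], [1.2], [3.0], [2.9], [5.0]])  # predictor
y = np.array([2.0, 2.2, 1.9, 2.1, 4.1, 3.9, 6.2])                # response

counts = Counter(cells)
weights = np.array([1.0 / counts[c] for c in cells])  # each cell contributes equally

model = LinearRegression().fit(x, y, sample_weight=weights)
print(round(float(model.coef_[0]), 3), round(float(model.intercept_), 3))
```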
It should be noted that both of the above approaches take little account of the personality and agency of the volunteers.
It is possible to use the existing motives and preferences of volunteers to cover the "problem" parts of a study. For example, Tulloch and Szabo investigated the movements of citizen scientists and classified their research strategies, revealing, in particular, a group of roaming volunteers who purposely take long trips (at least 300 km from a major urban centre, and 1,900 km from their home base on average) to find interesting species or species on the verge of extinction [Tulloch, Szabo, 2012, p. 323]. An intermediate, adaptive approach is used, for example, in the South African Bird Atlas Project 2, where maps allow volunteers to see "under-surveyed" grid cells (the map is divided into cells whose color intensity depends on the number of protocols submitted for the area) and to decide where to visit next.
Motivation to participate in citizen science projects
Volunteers' interest in citizen science projects is uneven. Some projects on exciting topics very quickly attract massive citizen support: many people like to observe birds or distant galaxies. A study of volunteers' motivation in the Galaxy Zoo project found motives specific to its topic: the second most popular motivating factor was interest in astronomy, and among the other popular motives were beauty, the vast scale of the universe, and the opportunity to see galaxies that few people have ever seen [Jordan Raddick et al., 2013]. Another source of appeal may be the social and political importance of the project. In the case of environmental problems, one of the most important motives is the wish to solve the problem [Geoghegan et al., 2016] or an altruistic "greener sense of self" [Toomey, Domroese, 2013]. For example, in projects related to the study of plastic pollution, many participants note the importance of the fact that the data can influence policies [Rambonnet et al., 2019]; in a project surveying turtle nests, the major motive was the desire to protect the turtles [Mankowski et al., 2011]. Attachment to a particular topic may also result from the formation of a specific individual identity (e.g. seeing oneself as a "birder") and a specific community context (e.g. living in a town that is a bird sanctuary) [Jones et al., 2018, p. 289]. Shevchenko [Shevchenko, 2018] points out that factual and methodological approaches to science do not allow us to identify the participant in a citizen science project, who turns out to be indistinguishable from a computer or from bacteria solving the same problems, in particular the selection of the spatial structures of a protein; only an axiological approach reveals the person, their goals and values.
Numerous behavioral and psychological theories are used to analyze the behavior of participants, for example the theory of planned behavior (TPB) [Martin et al., 2016], Self-Determination Theory and Cognitive Load Theory [Miller et al., 2019], and self-determination theory combined with identity theory [Jones et al., 2018].
Many studies of the last decade are devoted to the motivation of volunteers. Motivation can be interpreted as intrinsic (stemming from the task itself) or extrinsic (concerning the outcomes of an activity) [Eveleigh et al., 2014]. Reed et al. single out as the main motivating factors social engagement (awareness of and interaction with others), interaction with the website (a sense of awareness, facility, and enjoyment) and helping (positive feelings from helping or volunteering) [Reed et al., 2013, p. 617]. In some studies, the first motive is the desire to "contribute to original scientific research" [Jordan Raddick et al., 2013]. According to a survey among citizen scientists, the main motives are the desire to contribute to science (43.69%), social and community involvement (13.01%), learning (10.80%), interest (10.29%), enjoyment (9.59%), fun (8.27%) and discovery (1.00%) [Jones et al., 2018, p. 300]. Nov, Arazy, and Anderson suggest dividing motivation into collective, norm-oriented, reward, collective identification, and intrinsic [Nov et al., 2011, p. 69]. Rotman et al. highlight such motivational factors as altruism, collectivism, principlism and egoism; the latter, interpreted as personal benefit, dominates in the early stages of participation in a project [Rotman et al., 2012]. The most common primary benefit consists in the extension of science literacy and knowledge. Nevertheless, the questions of a growing understanding of the essence of scientific research and the acquisition of new scientific knowledge remain open. Many researchers find no growth in the understanding of scientific research, but do find significant differences in knowledge of bird biology as compared to a control group. Brossard, Lewenstein, and Bonney [Brossard et al., 2005] assert that no statistically significant change in participants' attitudes toward science or the environment, or in participants' understanding of the scientific process, could be detected. Jordan, Gray, Howe, Brooks, and Ehrenfeld [Jordan et al., 2011] found little change in process understanding in the Spotting the Weedy Invasives project, but a 24% increase in knowledge of invasive plants. Crall et al. [Crall et al., 2013] found no changes in science literacy according to test results, but managed to capture some improvements in science literacy and knowledge using context-specific measures.
However, it is clear that motives, behavioral strategies, the distribution of time and effort, and other features vary across different groups of participants. Rotman et al. [Rotman et al., 2012] consider that the motivation of participants changes over the course of their activity in a project.
The most significant approach is the division of volunteers according to the amount of work they have done or, more broadly, by their behavioral strategies in projects. Participants may be divided into two categories: transient and regular [Ponciano et al., 2014; Miller et al., 2019]. Some researchers use a broader classification: active, less active, and passive participants. Crowston and Fagnot [Crowston, Fagnot, 2008] divided participants in massive virtual collaborations into non-contributors, initial contributors, continuing contributors, and meta-contributors. Quantitative measurement of engagement is calculated on the basis of activity ratio, relative activity duration, daily devoted time, and variation in periodicity; it helps to distinguish hardworking, spasmodic, persistent, lasting, and moderate participants [Ponciano, Brasileiro, 2015]. Notably, participating in a project with a high frequency and remaining in the project for a long time are contradictory characteristics [Ponciano, Brasileiro, 2015, p. 258]. Aristeidou, Scanlon, and Sharples [Aristeidou et al., 2017] offer five engagement profiles (hardworking, persistent, loyal, lurking and visitors) based on such factors as lack of time, website usability, fear, and quality of contributions, as well as reasons for joining and feelings of belonging to the community, adding the so-called "lurking ratio" to the metrics and capturing different facets of psychological engagement (roles, motivation, attitude, satisfaction and belonging). Maslanov and Dolmatov [Maslanov, Dolmatov, 2019] note another specific motive: opposition to the ideological discourse that may be present in "official science".
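Three of the engagement measures named above can be computed directly from timestamped contribution logs; the sketch below uses simplified formulas that may differ from those of [Ponciano, Brasileiro, 2015] (daily devoted time is omitted, since it requires session durations), and the log is invented.

```python
from datetime import datetime
import statistics

def engagement(events: list[datetime], project_days: int) -> dict[str, float]:
    days = sorted({e.date() for e in events})
    linked = (days[-1] - days[0]).days + 1       # span from first to last active day
    gaps = [(b - a).days for a, b in zip(days, days[1:])]
    return {
        "activity_ratio": len(days) / linked,          # share of active days in the span
        "relative_duration": linked / project_days,    # how long the volunteer stayed
        "periodicity_variation": statistics.pstdev(gaps) if gaps else 0.0,
    }

log = [datetime(2021, 1, d) for d in (1, 2, 5, 6, 20)]
print(engagement(log, project_days=90))
```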
Special attention is often paid to the potential for involving and retaining newcomers, volunteers who show little interest and involvement (micro individual contributions) or who leave the project quickly [Nov et al., 2011]. Jackson, Østerlund, Maidel, Crowston, and Mugar [Jackson et al., 2016] offer a special behavioral classification of beginners: casual workers (light work), community workers (posting comments early in their participation), focused workers (contributing to the science goals of the project) and dropouts. Eveleigh et al. [Eveleigh et al., 2014] note that the majority of project participants are those who contribute in small quantities, "dabblers", and that it is therefore important to increase the attractiveness of projects for those who come in "just to try".
On the other hand, the last statement does not hold for projects with a high level of complexity (which may be called heavyweight collaboration [Haythornthwaite, 2009]). In such a complex project as the discovery of gravitational lenses, approximately 1% of the 40,000 volunteers made 90% of the contributions, while the skill factor was distributed more broadly than the contribution rate, with 20% of agents accumulating 80% of the skill [Marshall et al., 2016]. Another effect is that data quality grows with the experience of volunteers [Jiguet, 2009]. Therefore, especially for projects requiring preparation, efforts should be made to retain experienced volunteers over time, which means informing them about the results of the project, such as publications, conservation initiatives, management decisions, or policy actions [Chu et al., 2012; Land-Zandstra et al., 2016; Thiel et al., 2014]. Despite the large number of projects and volunteers involved, and the idea that the goal of most citizen science projects is advancing scientific understanding [Theobald et al., 2015, p. 240], it is not so easy to trace the scientific results obtained. Theobald et al. point out that only 12% of the (biodiversity) projects reviewed have articles in peer-reviewed journals [Theobald et al., 2015, p. 240]. Cooper, Shirk, and Zuckerberg ascertained that more than half of the central claims about the impacts of climate change on avian migration were based on data from citizen scientists [Cooper et al., 2014, pp. 2-3]. Follett and Strezov note that some projects use other forms of presenting results, such as reports or websites [Follett, Strezov, 2015]. Thiel et al. [Thiel et al., 2014] were able to identify 227 scientific publications devoted to marine research based on volunteers' activity.
Recent studies suggest analyzing the behavior of volunteers not within the framework of a single project but in terms of their general strategy on multi-project citizen science platforms [Ponciano, Pereira, 2019]. In citizen science game projects, new interface and game-mechanic changes are likewise built on a greater role for players, who can choose their own level and cognitive load [Miller et al., 2019].
A more accurate understanding of the motives and characteristics of the volunteers required for a project can contribute to targeted tactics for attracting volunteers, organizing communication, providing feedback, designing the project interface, etc. Many researchers emphasize the importance of the attractiveness and functionality of the site, application, etc. used for collecting data. A widely known technique is gamification. A gamification element can act as a set of points that help track and compare the individual efforts of volunteers, for example as competitive "values of birding" [Sullivan et al., 2009]. There are also citizen science projects made purely in the form of online multiplayer computer games that use dynamic and stylized graphical interfaces, including awarding points for tasks, competition between participants, and performance ranking (e.g., Foldit, Phylo, Eyewire, and EteRNA). The greatest scientific success was achieved by the game Foldit (University of Washington) in the field of protein folding research [Bauer, Popovic, 2017]. Puzzle solving by non-professionals supports the mapping of retinal neurons from a microscope data bank generated by the Max Planck Institute for Medical Research in the Eyewire game (Princeton University) and the mapping of genetic codes in Phylo (McGill University). It should be noted that such game design covers quite complex content: a participant usually has to complete a number of tutorial levels to take part in the project.
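The point-awarding and ranking mechanics described above amount to very simple bookkeeping; in the toy sketch below the task types and point values are invented.

```python
from collections import defaultdict

POINTS = {"classify": 1, "transcribe": 3, "verify": 2}   # invented point values
scores = defaultdict(int)

def complete_task(user: str, task: str) -> None:
    scores[user] += POINTS[task]                          # award points per task

for user, task in [("ann", "classify"), ("bob", "transcribe"), ("ann", "verify")]:
    complete_task(user, task)

leaderboard = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
print(leaderboard)   # ann and bob tie at 3 points each
```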
Fig. 1. Strategies of citizen scientists' involvement according to the complexity of tasks
Thus, the development of a citizen science strategy should take into account the characteristics of the project itself, the most important of which is the complexity of the tasks assigned to the volunteers, as well as participants' personal peculiarities (Fig. 1). In the figure, the complexity of the project tasks is illustrated by the increasing saturation of the orange background (from white, indicating the easiest tasks, to bright orange, associated with the most difficult ones), while the size of the blue circles represents the skill level of the participants. For more complex tasks, where experience can improve the quality of work, there is an urgent need to establish communication with citizen scientists and to provide feedback; interactive maps, data visualization, and the ability to assess the scope of work and its effectiveness are advantages here.
In some cases gamification, the ability to earn points, receive statuses, etc., is a good solution. Participants can be attracted through channels that help find people who are passionate about similar topics and potentially have initial information. The most complex projects, in which only people of a certain qualification can actually cope with the tasks, require the maximum inclusion of citizen scientists in the scientific project and the provision of high-quality data illustrating scientific results.
Conclusion
Information and communication technologies have created new opportunities in the field of scientific creativity. Citizen science annually provides scientists in a wide variety of fields around the world with an enormous amount of data, but its use is still limited. Today, a scientist wishing to create a citizen science project should take into account many factors: the development of the project requires determining the necessary number of participants with relevant qualifications, the ratio of virtual to real-world volunteer work, and other parameters. In addition to the problem of citizen scientists' motivation, there is the problem of the quality of the information received: the data require validation and verification before they can be used in scientific research. To create an effective citizen science project, it is necessary to combine technical capabilities for improving the quality of information with attention to the personal characteristics of participants and active interaction with volunteers, considering them not just as a data source but as an active resource.
References
Ahrends, A., Fanning, E., Rahbek, C., Burgess, N.D., Gereau, R.E., Marchant, R., Bulling, M.T., Lovett, J.C., Platts, P.J., Wilkins Kindemba, V., Owen, N. (2011). Funding Begets Biodiversity. Diversity and Distributions, 17, 191-200.
Aristeidou, M., Scanlon, E., Sharples, M. (2017). Profiles of Engagement in Online Communities of Citizen Science Participation. Computers in Human Behavior, 74, 246-256. DOI: 10.1016/j.chb.2017.04.044.
Assis, J., Tavares, D., Tavares, J., Cunha, A., Alberto, F., Serrao, E. A. (2009). Findkelp, a GIS-Based Community Participation Project to Assess Portuguese Kelp Conservation Status. Journal of Coastal Research, 56 (II), 1469-1473.
Azzurro, E., Aguzzi, J., Maynou, F., Chiesa, J.J., Savini, D. (2013). Diel Rhythms in Shallow Mediterranean Rocky-reef Fishes: a Chronobiological Approach with the Help of Trained Volunteers. Journal of the Marine Biological Association of the United Kingdom, 93 (2), 461-470. DOI: 10.1017/s0025315412001166.
Ballard, H.L., Trettevick, J.A., Collins, D. (2008). Comparing Participatory Ecological Research in Two Contexts: An Immigrant Community and a Native American Community on Olympic Peninsula. In C. Wilmsen, W. Elmendorf, L. Fisher, J. Ross, B. Sararthy, G. Wells (Eds.) (pp. 187-215).
Ballard, H.L., Dixon, C.G.H., Harris, E.M. (2017). Youth-focused Citizen Science: Examining the Role of Environmental Science Learning and Agency for Conservation. Biological Conservation, 208, 65-75. DOI: 10.1016/J.BIOCON.2016.05.024.
Balmer, D., Gillings, S., Caffrey, B., Swann, B., Downie, I., Fuller, R. (2013). Bird Atlas 2007-2011: The Breeding and Wintering Birds of Britain and Ireland. BTO Books.
Bauer, A., Popovic, Z. (2017). Collaborative Problem Solving in an Open-Ended Scientific Discovery Game. In Proceedings of the ACM on Human-Computer Interaction, 1 (CSCW), 1-21. DOI: 10.1145/3134657.
Bird, T.J., Bates, A.E., Lefcheck, J.S., Hill, N.A., Thomson, R.J., Edgar, G.J., Stuart-Smith, R.D., Wotherspoon, S., Krkosek, M., Stuart-Smith, J.F., Pecl, G.T., Barrett, N., Frusher, S. (2014). Statistical Solutions for Error and Bias in Global Citizen Science Datasets. Biological Conservation, 173, 144-154. DOI: 10.1016/j.biocon.2013.07.037.
Bonney, R. (1996). Citizen Science: A Lab Tradition. Living Bird, 15 (4), 7-15.
Bonney, R., Ballard, H., Jordan, H., McCallie, E., Phillips, T., Shirk, J., Wilderman, C. (2009). Public Participation in Scientific Research: Defining the Field and Assessing its Potential for Informal Science Education. A CAISE Inquiry Group Report. Available at: https://www.informalscience.org/sites/default/files/PublicParticipationinScientificResearch.pdf (date accessed: 04.03.2021).
Bonney, R., Cooper, C.B., Dickinson, J., Kelling, S., Phillips, T., Rosenberg, K.V., Shirk, J. (2009). Citizen Science: A Developing Tool for Expanding Science Knowledge and Scientific Literacy. BioScience, 59(11), 977-984. DOI: 10.1525/bio.2009.59.11.9.
Bonter, D.N., Cooper, C.B. (2012). Data Validation in Citizen Science: a Case Study from Project FeederWatch. Frontiers in Ecology and the Environment, 10 (6), 305-307. DOI: 10.1890/110273.
Botts, E.A., Erasmus, B.F.N., Alexander, G.J. (2011). Geographic Sampling Bias in the South African Frog Atlas Project: Implications for Conservation planning. Biodiversity and Conservation, 20 (1), 119-139. DOI: 10.1007/s10531-010-9950-6.
Brooks, S.J., Fitch, B., Davy-Bowker, J., Codesal, S.A. (2019). Anglers' Riverfly Monitoring Initiative (ARMI): A UK-wide Citizen Science Project for Water Quality Assessment. Freshwater Science, 38 (2), 270-280. DOI: 10.1086/703397.
Brossard, D., Lewenstein, B., Bonney, R. (2005). Scientific Knowledge and Attitude Change: The Impact of a Citizen Science Project. International Journal of Science Education, 27 (9), 1099-1121. DOI: 10.1080/09500690500069483.
Carr, A.J.L. (2004). Why do We All Need Community Science. Society and Natural Resources, 17, 841-849.
Chen, L.-J., Ho, Y.-H., Lee, H.-C., Wu, H.-C., Liu, H.-M., Hsieh, H.-H., Huang, Y.-T., Lung, S.-C. C. (2017). An Open Framework for Participatory PM2.5 Monitoring in Smart Cities. IEEE Access, 5, 14441-14454. DOI: 10.1109/ACCESS.2017.2723919.
Chu, M., Leonard, P., Stevenson, F. (2012). Growing the Base for Citizen Science: Recruiting and Engaging Participants. In J. Dickinson, R. Bonney (Eds.), Citizen Science: Public Participation in Environmental Research (pp. 69-81). Cornell University Press.
Cohn, J.P. (2008). Citizen Science: Can Volunteers Do Real Research? BioScience, 58 (3), 192-197. DOI: 10.1641/B580303.
Conn, P.B., McClintock, B.T., Cameron, M.F., Johnson, D.S., Moreland, E.E., Boveng, P.L. (2013). Accommodating Species Identification Errors in Transect Surveys. Ecology, 94, 2607-2618.
Cooper, C.B., Dickinson, J.L., Phillips, T., Bonney, R. (2008). Science Explicitly for Nonscientists. Ecology and Society, 13 (2), r1. Available at: https://www.ecologyandsociety.org/vol13/iss2/resp1/ (date accessed: 24.02.2021).
Cooper, C.B. (2016). Citizen Science: How Ordinary People are Changing the Face of Discovery. The Overlook Press.
Cooper, C.B., Shirk, J., Zuckerberg, B. (2014). The Invisible Prevalence of Citizen Science in Global Research: Migratory Birds and Climate Change. PLoS ONE, 9 (9), e106508. DOI: 10.1371/journal.pone.0106508.
Corburn, J. (2005). Street Science: Community Knowledge and Environmental Health Justice. The MIT Press.
Crall, A.W., Jordan, R., Holfelder, K., Newman, G.J., Graham, J., Waller, D.M. (2013). The Impacts of an Invasive Species Citizen Science Training Program on Participant Attitudes, Behavior, and Science Literacy. Public Understanding of Science (Bristol, England), 22 (6), 745-764. DOI: 10.1177/0963662511434894.
Crowston, K., Fagnot, I. (2008). The Motivational Arc of Massive Virtual Collaboration. Available at: https://crowston.syr.edu/content/motivational-arc-massive-virtual-collaboration (date accessed: 04.03.2021).
Cuff, D., Hansen, M., Kang, J. (2008). Urban Sensing. Communications of the ACM, 51 (3), 24-33. DOI: 10.1145/1325555.1325562.
Culver, C.S., Schroeter, S.C., Page, H.M., Dugan, J.E. (2010). Essential Fishery Information for Trap-Based Fisheries: Development of a Framework for Collaborative Data Collection. Marine and Coastal Fisheries, 2 (1), 98-114. DOI: 10.1577/C09-007.1.
Curtis, V. (2015). Motivation to Participate in an Online Citizen Science Game. Science Communication, 37(6), 723-746. DOI: 10.1177/1075547015609322.
Curtis, V. (2018). Online Citizen Science and the Widening of Academia: Distributed Engagement with Research and Knowledge Production. Springer. Available at: https://books.google.ru/books?id=u qZWDwAAQBAJ&dq=Grey,+F.+(2009).+Viewpoint:+The+age+of+citizen+cyberscience.+CER N+Courier.&hl=ru&source=gbs_navlinks_s (date accessed: 24.02.2021).
Delaney, D.G., Sperling, C.D., Adams, C.S., Leung, B. (2008). Marine Invasive Species: Validation of Citizen Science and Implications for National Monitoring Networks. Biological Invasions, 10 (1), 117-128. DOI: 10.1007/s10530-007-9114-0.
Dennis, R.L.H., Thomas, C.D. (2000). Bias in Butterfly Distribution Maps: The Influence of Hot Spots and Recorder's Home Range. Journal of Insect Conservation, 4 (2), 73-77. DOI: 10.1023/A:1009690919835.
Dickinson, J.L., Bonney, R., Fitzpatrick, J.W. (2012). Citizen Science: Public Participation in Environmental Research. Cornell University Press. Available at: https://books.google.ru/books?id=b DIyrXuS6ooC&hl=ru&source=gbs_navlinks_s (date accessed: 24.02.2021).
Dickinson, J.L., Zuckerberg, B., Bonter, D.N. (2010). Citizen Science as an Ecological Research Tool: Challenges and Benefits. Annual Review of Ecology, Evolution, and Systematics, 41 (1), 149-172. DOI: 10.1146/annurev-ecolsys-102209-144636.
Dudik, M., Phillips, S.J., Schapire, R.E. (2006). Correcting Sample Selection Bias in Maximum Entropy Density Estimation (pp. 323-330). Available at: https://papers.nips.cc/paper/2929-correcting-sample-selection-bias-in-maximum-entropy-density-estimation (date accessed: 24.02.2021).
Eastman, L., Hidalgo-Ruz, V., Macaya, V., Nunez, P., Thiel, M. (2014). The Potential for Young Citizen Scientist Projects: a Case Study of Chilean Schoolchildren Collecting Data on Marine Litter. Revista de Gestão Costeira Integrada, 14 (4), 569-579. DOI: 10.5894/rgci507.
Edgar, G., Stuart-Smith, R. (2009). Ecological Effects of Marine Protected Areas on Rocky Reef Communities — a Continental-scale Analysis. Marine Ecology Progress Series, 388, 51-62. DOI: 10.3354/meps08149.
Edmondson, E., Lintott, C., Raddick, J., Schawinski, K., Wallin, J., Fortson, L.F., Masters, K., Nichol, R., Borne, K. (2012). Galaxy Zoo: Morphological Classification and Citizen Science. In M. Way, J. Scargle, K. Ali, A. Srivastava (Eds.), Advances in Machine Learning and Data Mining for Astronomy (pp. 213-236). CRC Press, Taylor & Francis Group.
Eitzel, M.V., Cappadonna, J.L., Santos-Lang, C., Duerr, R.E., Virapongse, A., West, S.E., Kyba, C.C.M., Bowser, A., Cooper, C.B., Sforzi, A., Metcalfe, A.N., Harris, E.S., Thiel, M., Haklay, M., Ponciano, L., Roche, J., Ceccaroni, L., Shilling, F.M., Dörler, D., ... Jiang, Q. (2017). Citizen Science Terminology Matters: Exploring Key Terms. In Citizen Science: Theory and Practice (Vol. 2, Iss. 1, pp. 1-20). DOI: 10.5334/cstp.96.
Ellis, M.V., Taylor, J.E. (2018). Effects of Weather, Time of Day, and Survey Effort on Estimates of Species Richness in Temperate Woodlands. Emu: Austral Ornithology, 118 (2), 183-192. DOI: 10.1080/01584197.2017.1396188.
Elmore, K.L., Flamig, Z.L., Lakshmanan, V., Kaney, B.T., Farmer, V., Reeves, H.D., Rothfusz, L.P. (2014). mPING: Crowd-Sourcing Weather Reports for Research. Bulletin of the American Meteorological Society, 95 (9), 1335-1342. DOI: 10.1175/BAMS-D-13-00014.1.
Evans, S.M., Birchenough, A.C., Fletcher, H. (2000). The Value and Validity of Community-based Research: TBT Contamination of the North Sea. Marine Pollution Bulletin, 40 (3), 220-225. DOI: 10.1016/S0025-326X(99)00228-3.
Eveleigh, A., Jennett, C., Blandford, A., Brohan, P., Cox, A.L. (2014). Designing for Dabblers and Deterring Drop-outs in Citizen Science. Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems (CHI'14), 2985-2994. DOI: 10.1145/2556288.2557262.
Fehri, R., Khlifi, S., Vanclooster, M. (2020). Testing a Citizen Science Water Monitoring Approach in Tunisia. Environmental Science & Policy, 104, 67-72. DOI: 10.1016/J.ENVSCI.2019.11.009.
Fischer, F. (2000). Citizens, Experts, and the Environment: the Politics of Local Knowledge. Duke University Press.
Fithian, W., Elith, J., Hastie, T., Keith, D.A. (2015). Bias Correction in Species Distribution Models: Pooling Survey and Collection Data for Multiple Species. Methods in Ecology and Evolution, 6 (4), 424-438. DOI: 10.1111/2041-210X.12242.
Follett, R., Strezov, V. (2015). An Analysis of Citizen Science Based Research: Usage and Publication Patterns. PLOS ONE, 10 (11), e0143687. DOI: 10.1371/journal.pone.0143687.
Foster-Smith, J., Evans, S.M. (2003). The Value of Marine Ecological Data Collected by Volunteers. Biological Conservation, 113, 199-213.
Gashkova, E.M., Berezovskaya, I.P., Shipunova, O.D. (2018). Models of Self-Identification in Digital Communication Environments. The European Proceedings of Social and Behavioural Sciences, 35, 374-382.
Geldmann, J., Heilmann-Clausen, J., Holm, T.E., Levinsky, I., Markussen, B., Olsen, K., Rahbek, C., Tottrup, A.P. (2016). What Determines Spatial Bias in Citizen Science? Exploring Four Recording Schemes with Different Proficiency Requirements. Diversity and Distributions, 22 (11), 1139-1149. DOI: 10.1111/ddi.12477.
Geoghegan, H., Dyke, A., Pateman, R., West, S., Everett, G. (2016). Understanding Motivations for Citizen Science. Available at: http://www.ukeof.org.uk/resources/citizen-science-resources/Mot ivationsforCSREPORTFINALMay2016.pdf (date accessed: 24.02.2021).
Giraud, C., Calenge, C., Coron, C., Julliard, R. (2016). Capitalizing on Opportunistic Data for Monitoring Relative Abundances of Species. Biometrics, 72(2), 649-658. DOI: 10.1111/biom.12431.
Goffredo, S., Pensa, F., Neri, P., Orlandi, A., Gagliardi, M. S., Velardi, A., Piccinetti, C., Zaccanti, F. (2010). Unite Research with What Citizens Do for Fun: "Recreational Monitoring" of Marine Biodiversity. Ecological Applications, 20 (8), 2170-2187. DOI: 10.1890/09-1546.1.
Goodchild, M.F. (2007). Citizens as Sensors: the World of Volunteered Geography. GeoJournal, 69 (4), 211-221. DOI: 10.1007/s10708-007-9111-y.
Gowan, C., Ruby, M., Knisley, R., Grimmer, L. (2007). Stream Monitoring Methods Suitable for Citizen Volunteers Working in the Coastal Plain and Lower Piedmont Regions of Virginia. American Entomologist, 53 (1), 48-57. DOI: 10.1093/ae/53.1.48.
Grant, S., Berkes, F. (2007). Fisher Knowledge as Expert System: A Case from the Longline Fishery of Grenada, the Eastern Caribbean. Fisheries Research, 84 (2), 162-170.
Gu, W., Swihart, R.K. (2004). Absent or Undetected? Effects of Non-detection of Species Occurrence on Wildlife-habitat Models. Biological Conservation, 116 (2), 195-203. DOI: 10.1016/S0006-3207(03)00190-3.
Haklay, M. (2013). Citizen Science and Volunteered Geographic Information: Overview and Typology of Participation. In D. Sui, S. Elwood, M. Goodchild (Eds.), Crowdsourcing Geographic Knowledge (pp. 105-122). Springer Netherlands. DOI: 10.1007/978-94-007-4587-2_7.
Haythornthwaite, C. (2009). Crowds and Communities: Light and Heavyweight Models of Peer Production. 2009 42nd Hawaii International Conference on System Sciences, 1-10. DOI: 10.1109/HICSS.2009.137.
Heilmann-Clausen, J., Læssøe, T. (2012). On Species Richness Estimates, Climate Change and Host Shifts in Wood-Inhabiting Fungi. Fungal Ecology, 5 (5), 641-646. DOI: 10.1016/j.funeco.2011.10.003.
Henderson, P.A. (2003). Practical Methods in Ecology. Blackwell Pub. Available at: https://books.google.ru/books?id=OLmixdNrxVcC&dq=Henderson+Practical+Methods+in+Ecology&lr=&hl=ru&source=gbs_navlinks_s (date accessed: 24.02.2021).
Hidalgo-Ruz, V., Thiel, M. (2013). Distribution and Abundance of Small Plastic Debris on Beaches in the SE Pacific (Chile): A Study Supported by a Citizen Science Project. Marine Environmental Research, 87-88, 12-18. DOI: 10.1016/j.marenvres.2013.02.015.
Hijmans, R.J., Garrett, K.A., Huaman, Z., Zhang, D.P., Schreuder, M., Bonierbale, M. (2000). Assessing the Geographic Representativeness of Genebank Collections: the Case of Bolivian Wild Potatoes. Conservation Biology, 14 (6), 1755-1765.
Hill, M.O. (2012). Local Frequency as a Key to Interpreting Species Occurrence Data When Recording Effort Is Not Known. Methods in Ecology and Evolution, 3 (1), 195-205. DOI: 10.1111/j.2041-210X.2011.00146.x.
Hörnsten, L., Fredman, P. (2000). On the Distance to Recreational Forests in Sweden. Landscape and Urban Planning, 51, 1-10.
Irwin, A. (1995). Citizen Science. London: Routledge.
Jackson, C., Østerlund, C., Maidel, V., Crowston, K., Mugar, G. (2016). Which Way Did They Go? Newcomer Movement through the Zooniverse. Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (CSCW '16), 623-634. DOI: 10.1145/2818048.2835197.
Jiguet, F. (2009). Method Learning Caused a First-time Observer Effect in a Newly Started Breeding Bird Survey. Bird Study, 56 (2), 253-258. DOI: 10.1080/00063650902791991.
Johnston, A., Hochachka, W., Strimas-Mackey, M., Gutierrez, V. R., Robinson, O., Miller, E., ... Fink, D. (2019). Best Practices for Making Reliable Inferences from Citizen Science Data: Case Study Using eBird to Estimate Species Distributions. BioRxiv, 574392. DOI: 10.1101/574392.
Jones, M.G., Childers, G., Andre, T., Corin, E.N., Hite, R. (2018). Citizen Scientists and Non-citizen Scientist Hobbyists: Motivation, Benefits, and Influences. International Journal of Science Education, Part B, 8 (4), 287-306. DOI: 10.1080/21548455.2018.1475780.
Jordan, R.C., Gray, S.A., Howe, D.V., Brooks, W.R., Ehrenfeld, J.G. (2011). Knowledge Gain and Behavioral Change in Citizen-Science Programs. Conservation Biology, 25 (6), 1148-1154. DOI: 10.1111/j.1523-1739.2011.01745.x.
Jordan Raddick, M., Bracey, G., Gay, P.L., Lintott, C.J., Cardamone, C., Murray, P., Schawinski, K., Szalay, A.S., Vandenberg, J. (2013). Galaxy Zoo: Motivations of Citizen Scientists. Astronomy Education Review, 12 (1). DOI: 10.3847/AER2011021.
Kadmon, R., Farber, O., Danin, A. (2004). Effect of Roadside Bias on the Accuracy of Predictive Maps Produced by Bioclimatic Models. Ecological Applications, 14 (2), 401-413.
Kamp, J., Oppel, S., Heldbjerg, H., Nyegaard, T., Donald, P.F. (2016). Unstructured Citizen Science Data Fail to Detect Long-term Population Declines of Common Birds in Denmark. Diversity and Distributions, 22 (10), 1024-1035. DOI: 10.1111/ddi.12463.
Kelling, S., Fink, D., La Sorte, F.A., Johnston, A., Bruns, N.E., Hochachka, W.M. (2015). Taking a 'Big Data' Approach to Data Quality in a Citizen Science Project. Ambio, 44 (S4), 601-611. DOI: 10.1007/s13280-015-0710-4.
Kelling, S., Johnston, A., Bonn, A., Fink, D., Ruiz-Gutierrez, V., Bonney, R., Fernandez, M., Hochachka, W.M., Julliard, R., Kraemer, R., Guralnick, R. (2019). Using Semistructured Surveys to Improve Citizen Science Data for Monitoring Biodiversity. BioScience, 69 (3), 170-179. DOI: 10.1093/biosci/biz010.
Kéry, M., Schmid, H., Zbinden, N. (2006). Trend Analyses from Chance Observations of Birds in Switzerland: Correction for Effort and Random-effects Models for Combined Analyses Across Species. Journal of Ornithology, 147 (Suppl.), 123.
Kéry, M., Royle, J.A., Schmid, H., Schaub, M., Volet, B., Häfliger, G., Zbinden, N. (2010). Site-Occupancy Distribution Modeling to Correct Population-Trend Estimates Derived from Opportunistic Observations. Conservation Biology, 24 (5), 1388-1397. DOI: 10.1111/j.1523-1739.2010.01479.x.
Klimenko, N., Tyakht, A., Popenko, A., Vasiliev, A., Altukhov, I., Ischenko, D., Shashkova, T., Efimova, D., Nikogosov, D., Osipenko, D., Musienko, S., Selezneva, K., Baranova, A., Kurilshikov, A., Toshchakov, S., Korzhenkov, A., Samarov, N., Shevchenko, M., Tepliuk, A., Alexeev, D. (2018). Microbiome Responses to an Uncontrolled Short-Term Diet Intervention in the Frame of the Citizen Science Project. Nutrients, 10 (5), 576. DOI: 10.3390/nu10050576.
Küng, R. (2018). SpaghettiLens: A Software Stack for Modeling Gravitational Lenses by Citizen Scientists. Astronomy and Computing, 23, 115-123. DOI: 10.1016/J.ASCOM.2018.02.007.
Küng, R., Saha, P., Ferreras, I., Baeten, E., Coles, J., Cornen, C., Macmillan, C., Marshall, P., More, A., Oswald, L., Verma, A., Wilcox, J.K. (2018). Models of Gravitational Lens Candidates from Space Warps CFHTLS. Monthly Notices of the Royal Astronomical Society, 474 (3), 3700-3713. DOI: 10.1093/mnras/stx3012.
Land-Zandstra, A.M., Devilee, J.L.A., Snik, F., Buurmeijer, F., van den Broek, J.M. (2016). Citizen Science on a Smartphone: Participants' Motivations and Learning. Public Understanding of Science, 25 (1), 45-60. DOI: 10.1177/0963662515602406.
Latimer, A.M., Wu, S., Gelfand, A.E., Silander Jr., J. A. (2006). Building Statistical Models to Analyze Species Distributions. Ecological Applications, 16 (1), 33-50. DOI: 10.1890/04-0609.
Lawler, J.J., O'Connor, R.J. (2004). How Well Do Consistently Monitored Breeding Bird Survey Routes Represent the Environments of the Conterminous United States? The Condor, 106, 801-814.
Le Fur, J., Guilavogui, A., Teitelbaum, A. (2011). Contribution of Local Fishermen to Improving Knowledge of the Marine Ecosystem and Resources in the Republic of Guinea, West Africa. Canadian Journal of Fisheries and Aquatic Sciences, 68 (8), 1454-1469. DOI: 10.1139/f2011-061.
Lowry, C.S., Fienen, M.N., Hall, D.M., Stepenuck, K.F. (2019). Growing Pains of Crowdsourced Stream Stage Monitoring Using Mobile Phones: The Development of CrowdHydrology. Frontiers in Earth Science, 7. DOI: 10.3389/feart.2019.00128.
Mac Aodha, O., Gibb, R., Barlow, K.E., Browning, E., Firman, M., Freeman, R., Harder, B., Kinsey, L., Mead, G. R., Newson, S. E., Pandourski, I., Parsons, S., Russ, J., Szodoray-Paradi, A., Szodoray-Paradi, F., Tilova, E., Girolami, M., Brostow, G., Jones, K. E. (2018). Bat Detective-Deep Learning Tools for Bat Acoustic Signal Detection. PLoS Computational Biology, 14 (3), e1005995. DOI: 10.1371/journal.pcbi.1005995.
MacKenzie, D.I. (2018). Occupancy Estimation and Modeling: Inferring Patterns and Dynamics of Species Occurrence. Academic Press, an imprint of Elsevier.
MacKerron, G., Mourato, S. (2013). Happiness is Greater in Natural Environments. Global Environmental Change, 23 (5), 992-1000. DOI: 10.1016/J.GLOENVCHA.2013.03.010.
MacLeod, C.D., Bannon, S.M., Pierce, G.J., Schweder, C., Learmonth, J.A., Herman, J.S., Reid, R. J. (2005). Climate Change and the Cetacean Community of North-West Scotland. Biological Conservation, 124 (4), 477-483. DOI: 10.1016/j.biocon.2005.02.004.
Mahajan, S., Kumar, P., Pinto, J. A., Riccetti, A., Schaaf, K., Camprodon, G., Smari, V., Passani, A., Forino, G. (2020). A Citizen Science Approach for Enhancing Public Understanding of Air Pollution. Sustainable Cities and Society, 52, 101800. DOI: 10.1016/J.SCS.2019.101800.
Mair, L., Ruete, A. (2016). Explaining Spatial Variation in the Recording Effort of Citizen Science Data across Multiple Taxa. PLOS ONE, 11 (1), e0147796. DOI: 10.1371/journal.pone.0147796.
Maisonneuve, N., Stevens, M., Niessen, M.E., Steels, L. (2009). NoiseTube: Measuring and Mapping Noise Pollution with Mobile Phones. Information Technologies in Environmental Engineering, Proceedings of the 4th International ICSC Symposium, ITEE 2009, 215-228. DOI: 10.1007/978-3-540-88351-7_16.
Maisonneuve, N., Stevens, M., Ochab, B. (2010). Participatory Noise Pollution Monitoring Using Mobile Phones. Information Polity, 15 (1-2), 51-71. DOI: 10.3233/IP-2010-0200.
Makhnach, A.V., Laktionova, A.I., Postylyakova, Y.V. (2019). Citizen Science in Socio-psychological Research. Institute of Psychology of the Russian Academy of Sciences. Social and Economic Psychology, 4 (16), 43-70.
Mankowski, T.A., Slater, S.J., Slater, T.F. (2011). An Interpretive Study of Meanings Citizen Scientists Make When Participating in Galaxy Zoo. Contemporary Issues in Education Research (CIER), 4 (4), 25. DOI: 10.19030/cier.v4i4.4165.
Marshall, P.J., Verma, A., More, A., Davis, C.P., More, S., Kapadia, A., Parrish, M., Snyder, C., Wilcox, J., Baeten, E., Macmillan, C., Cornen, C., Baumer, M., Simpson, E., Lintott, C.J., Miller, D., Paget, E., Simpson, R., Smith, A.M., ... Collett, T.E. (2016). Space Warps — I. Crowdsourcing the Discovery of Gravitational Lenses. Monthly Notices of the Royal Astronomical Society, 455 (2), 1171-1190. DOI: 10.1093/mnras/stv2009.
Martin, V., Smith, L., Bowling, A., Christidis, L., Lloyd, D., Pecl, G. (2016). Citizens as Scientists: What Influences Public Contributions to Marine Research? Science Communication, 38 (4), 495-522. DOI: 10.1177/1075547016656191.
Maslanov, E.V., Dolmatov, A.V. (2019). Citizen Science — Science as a Vocation. Epistemology & Philosophy of Science, 56 (3), 40-44. DOI: 10.5840/eps201956345.
Miller, D.A.W., Pacifici, K., Sanderlin, J.S., Reich, B.J. (2019). The Recent Past and Promising Future for Data Integration Methods to Estimate Species' Distributions. Methods in Ecology and Evolution, 10 (1), 22-37. DOI: 10.1111/2041-210X.13110.
Miller, J.A., Narayan, U., Hantsbarger, M., Cooper, S., El-Nasr, M.S. (2019). Expertise and Engagement: Re-Designing Citizen Science Games With Players' Minds in Mind. Proceedings of the 14th International Conference on the Foundations of Digital Games (FDG '19), 1-11. DOI: 10.1145/3337722.3337735.
Minkler, M., Wallerstein, N. (Eds.). (2011). Community-based Participatory Research for Health: From Process to Outcomes. John Wiley & Sons.
Monk, J., Ierodiaconou, D., Bellgrove, A., Laurenson, L. (2008). Using Community-based Monitoring with GIS to Create Habitat Maps for a Marine Protected Area in Australia. Journal of the Marine Biological Association of the United Kingdom, 88, 865-871.
Nov, O., Arazy, O., Anderson, D. (2011). Dusting for Science: Motivation and Participation of Digital Citizen Science Volunteers. In Proceedings of the 2011 iConference: Inspiration, Integrity, and Intrepidity (Seattle, Washington, USA, February 8-11, 2011), 68-74. DOI: 10.1145/1940761.1940771.
Oliveira, C.V., Olmos, F., dos Santos-Filho, M., Bernardo, C.S.S. (2018). Observation of Diurnal Soaring Raptors in Northeastern Brazil Depends on Weather Conditions and Time of Day. The Journal of Raptor Research, 52 (1), 56-65.
Osborn, D.A., Pearse, J.S., Roe, C.A. (2005). Monitoring Rocky Intertidal Shorelines: a Role for the Public in Resource Management. California and the World Ocean '02, Conference Proceedings, 624-636.
Ottinger, G. (2010). Buckets of Resistance: Standards and the Effectiveness of Citizen Science. Science, Technology, & Human Values, 35 (2), 244-270. DOI: 10.1177/0162243909337121.
Parrish, J.K., Burgess, H., Weltzin, J.F., Fortson, L., Wiggins, A., Simmons, B. (2018). Exposing the Science in Citizen Science: Fitness to Purpose and Intentional Design. Integrative and Comparative Biology, 1-11. DOI: 10.1093/icb/icy032.
Peltier, H., Dabin, W., Daniel, P., Van Canneyt, O., Doremus, G., Huon, M., Ridoux, V. (2012). The Significance of Stranding Data as Indicators of Cetacean Populations at Sea. Ecological Indicators, 18, 278-290.
Peterson, A.T., Navarro-Siguenza, A.G., Benitez-Diaz, H. (1998). The Need for Continued Scientific Collecting: A Geographic Analysis of Mexican Bird Specimens. Ibis, 140 (2), 288-294. DOI: 10.1111/j.1474-919X.1998.tb04391.x.
Ponciano, L., Brasileiro, F. (2015). Finding Volunteers' Engagement Profiles in Human Computation for Citizen Science Projects. Human Computation, 1 (2), 245-264. DOI: 10.15346/hc.v1i2.12.
Ponciano, L., Brasileiro, F., Simpson, R., Smith, A. (2014). Volunteers' Engagement in Human Computation for Astronomy Projects. Computing in Science & Engineering, 16 (6), 52-59. DOI: 10.1109/MCSE.2014.4.
Ponciano, L., Pereira, T.E. (2019). Characterising Volunteers' Task Execution Patterns Across Projects on Multi-project Citizen Science Platforms. Proceedings of the 18th Brazilian Symposium on Human Factors in Computing Systems (IHC '19), 1-11. DOI: 10.1145/3357155.3358441.
Rachlin, Y., Negi, R., Khosla, P.K. (2011). The Sensing Capacity of Sensor Networks. IEEE Transactions on Information Theory, 57 (3), 1675-1691. DOI: 10.1109/TIT.2010.2103733.
Rambonnet, L., Vink, S.C., Land-Zandstra, A.M., Bosker, T. (2019). Making Citizen Science Count: Best Practices and Challenges of Citizen Science Projects on Plastics in Aquatic Environments. Marine Pollution Bulletin, 145, 271-277. DOI: 10.1016/J.MARPOLBUL.2019.05.056.
Reddy, S., Davalos, L.M. (2003). Geographical Sampling Bias and Its Implications for Conservation Priorities in Africa. Journal of Biogeography, 30 (11), 1719-1727. DOI: 10.1046/j.1365-2699.2003.00946.x.
Reed, J., Raddick, M.J., Lardner, A., Carney, K. (2013). An Exploratory Factor Analysis of Motivations for Participating in Zooniverse, a Collection of Virtual Citizen Science Projects. 2013 46th Hawaii International Conference on System Sciences, 610-619. DOI: 10.1109/HICSS.2013.85.
Reeves, N.T., Simperl, E. (2019). Efficient, but Effective? Proceedings of the ACM on Human-Computer Interaction, 3 (CSCW), 1-35. DOI: 10.1145/3359279.
Rotman, D., Preece, J., Hammock, J., Procita, K., Hansen, D., Parr, C., Lewis, D., Jacobs, D. (2012). Dynamic Changes in Motivation in Collaborative Citizen-science Projects. Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work (CSCW '12), 217. DOI: 10.1145/2145204.2145238.
Royle, J.A., Chandler, R.B., Yackulic, C., Nichols, J.D. (2012). Likelihood Analysis of Species Occurrence Probability from Presence-only Data for Modelling Species Distributions. Methods in Ecology and Evolution, 3 (3), 545-554. DOI: 10.1111/j.2041-210X.2011.00182.x.
Shannon, M.A., Antypas, A.R. (1996). Civic Science is Democracy in Action. Northwest Science, 70 (1), 66-69.
Shevchenko, S.Y. (2018). Citizen Science: Are People Distinguishable from Bacteria? Epistemology & Philosophy of Science, 55 (1), 171-183. DOI: 10.5840/eps201855115.
Shirk, J.L., Ballard, H.L., Wilderman, C.C., Phillips, T., Wiggins, A., Jordan, R., McCallie, E., Minarchek, M., Lewenstein, B.V., Krasny, M.E., Bonney, R. (2012). Public Participation in Scientific Research: A Framework for Deliberate Design. Ecology and Society, 17 (2), art29. DOI: 10.5751/ES-04705-170229.
Silvertown, J. (2009). A New Dawn for Citizen Science. Trends in Ecology & Evolution, 24 (9), 467-471. DOI: 10.1016/J.TREE.2009.03.017.
Sirbu, A., Becker, M., Caminiti, S., De Baets, B., Elen, B., Francis, L., Gravino, P., Hotho, A., Ingarra, S., Loreto, V., Molino, A., Mueller, J., Peters, J., Ricchiuti, F., Saracino, F., Servedio, V.D.P., Stumme, G., Theunis, J., Tria, F., Van den Bossche, J. (2015). Participatory Patterns in an International Air Quality Monitoring Initiative. PLoS ONE, 10 (8), e0136763. DOI: 10.1371/journal.pone.0136763.
Stevens, M., Vitos, M., Altenbuchner, J., Conquest, G., Lewis, J., Haklay, M. (2014). Taking Participatory Citizen Science to Extremes. IEEE Pervasive Computing, 13 (2), 20-29. DOI: 10.1109/MPRV.2014.37.
Sullivan, B.L., Wood, C.L., Iliff, M.J., Bonney, R.E., Fink, D., Kelling, S. (2009). eBird: A Citizen-based Bird Observation Network in the Biological Sciences. Biological Conservation, 142 (10), 2282-2292. DOI: 10.1016/j.biocon.2009.05.006.
Theobald, E.J., Ettinger, A.K., Burgess, H.K., DeBey, L.B., Schmidt, N.R., Froehlich, H.E., Wagner, C., HilleRisLambers, J., Tewksbury, J., Harsch, M.A., Parrish, J.K. (2015). Global Change and Local Solutions: Tapping the Unrealized Potential of Citizen Science for Biodiversity Research. Biological Conservation, 181, 236-244. DOI: 10.1016/j.biocon.2014.10.021.
Thiel, M., Penna-Diaz, M., Luna-Jorquera, G., Salas, S., Sellanes, J., Stotz, W. (2014). Citizen Scientists and Marine Research: Volunteer Participants, Their Contributions, and Projection for the Future. Oceanography and Marine Biology, 52, 257-314. DOI: 10.1201/b17143-6.
Tiufiakov, N., Dahanayake, A., Zudilova, T. (2018). Data Provenance in Citizen Science Databases. In C.S. Sidlo, P.Z. Revesz, T. Cerquitelli, B. Thalheim, A. Benczur, T. Horvath (Eds.), Communications in Computer and Information Science (vol. 909, pp. 242-253). Springer Verlag. DOI: 10.1007/978-3-030-00063-9_23.
Toomey, A.H., Domroese, M.C. (2013). Can Citizen Science Lead to Positive Conservation Attitudes and Behaviors? Human Ecology Review, 20, 50-62.
Troudet, J., Grandcolas, P., Blin, A., Vignes-Lebbe, R., Legendre, F. (2017). Taxonomic Bias in Biodiversity Data and Societal Preferences. Scientific Reports, 7 (1), 9132. DOI: 10.1038/s41598-017-09084-6.
Tulloch, A.I.T., Possingham, H.P., Joseph, L.N., Szabo, J., Martin, T.G. (2013). Realising the Full Potential of Citizen Science Monitoring Programs. Biological Conservation, 165, 128-138.
Tulloch, A.I.T., Szabo, J.K. (2012). A Behavioural Ecology Approach to Understand Volunteer Surveying for Citizen Science Datasets. Emu - Austral Ornithology, 112 (4), 313-325. DOI: 10.1071/MU12009.
Tweddle, J.C., Robinson, L.D., Pocock, M.J.O., Roy, H.E. (2012). Guide to Citizen Science: Developing, Implementing and Evaluating Citizen Science to Study Biodiversity and the Environment in the UK. Natural History Museum and NERC Centre for Ecology & Hydrology for UK-EOF.
van Strien, A.J., Termaat, T., Groenendijk, D., Mensing, V., Kéry, M. (2010). Site-occupancy Models May Offer New Opportunities for Dragonfly Monitoring Based on Daily Species Lists. Basic and Applied Ecology, 11 (6), 495-503. DOI: 10.1016/j.baae.2010.05.003.
Wessels, P., Moran, N., Johnston, A., Wang, W. (2019). Hybrid Expert Ensembles for Identifying Unreliable Data in Citizen Science. Engineering Applications of Artificial Intelligence, 81, 200-212. DOI: 10.1016/J.ENGAPPAI.2019.01.004.
Wiggins, A., Crowston, K. (2011). From Conservation to Crowdsourcing: A Typology of Citizen Science. In 2011 44th Hawaii International Conference on System Sciences (pp. 1-10). DOI: 10.1109/HICSS.2011.207.
Wiggins, A., He, Y. (2016). Community-based Data Validation Practices in Citizen Science. Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (CSCW '16), 1546-1557. DOI: 10.1145/2818048.2820063.
Wiggins, A., Newman, G., Stevenson, R.D., Crowston, K. (2011). Mechanisms for Data Quality and Validation. In IEEE Seventh International Conference on e-Science Workshops (eScienceW) (pp. 14-19).
Wilderman, C.C. (2007). Models of Community Science: Design Lessons from the Field. In C. McEver, R. Bonney, J. Dickinson, S. Kelling, K. Rosenberg, J.L. Shirk (Eds.), Citizen Science Toolkit Conference. Cornell Laboratory of Ornithology.