
Does Size Matter? The Multipolar International Landscape of Nanoscience

Abstract

How do different countries tackle nanoscience research? Are all countries similar except for a trivial size effect, as science is often assumed to be universal? Or does size dictate large differences, with large countries able to develop activities in all directions of research while small countries have to specialize in specific niches? Alternatively, is size irrelevant, because all countries have followed different historical paths, leading to different patterns of specialisation? Here, we develop an original method that uses a bottom-up definition of scientific subfields to map the international structure of any scientific field. Our analysis shows that nanoscience research does not follow a universal pattern of specialisation that simply scales the profile of a single global leader (e.g., the United States). Instead, we find a multipolar world, with four main ways of doing nanoscience.

Introduction

A basic (and generally implicit) assumption of science policies is that countries should focus on those fields in which they can be most competitive, for whatever reason. This assumption is probably inspired by the idea of comparative advantage through (economic) specialisation, which was initially conceived in trade theory [1,2]. Therefore, except for a few large countries (particularly the United States), which can be active in all fields of knowledge, most countries may show specialisation in specific areas, and this specialisation will be consistent with their degree of development [3,4].

On the empirical side, there have been many studies of international scientific production, with different focuses. Among the topics addressed, one finds the competition between different regions of the world [5–7] and the emergence of China as a new scientific power [8–10]. Several papers have studied how different countries specialize in different areas of science [11–14]. Most of these studies divide science in a ‘top-down’ way, by using pre-defined fields such as the Journal Subject Categories (JSC) of the Web of Science. Countries’ specialisations are determined by comparing the country’s production in each field to the world average, leading to the well-known “Revealed comparative advantage” (RCA) index introduced by Béla Balassa [15] and widely used in economics to study the relative efforts of countries in different domains, such as exports of different products.

Here, we study the international landscape of a specific field: nanoscience. This area represents a high priority for many countries, which have devoted huge amounts of funding to promote research [16,17]. There is an abundant literature studying nanoscience publications. Methodological articles have dealt with the proper way to define nanosciences, in order to obtain relevant databases [18–22]. Many papers have focused on specific subfields (ZnO nanostructures [23]; nano-energy [24]). Some have addressed important features of this new field, such as its interdisciplinarity [25], its relation to technological innovation [26,27] or its progressive institutionalization [28]. The international structure of nanoscience research has also received considerable attention. Most articles deal with specific geographical regions: Europe [29]; South Africa [30]; Australia [31]; Brazil [32]; China [33]. Islam and Miyazaki (2010) have studied the worldwide landscape based on nanotechnology-related academic publications from Elsevier’s Engineering Index Compendex database [34]. They define a priori (top-down) subfields and study the relative specializations of several regions of the world. They conclude that the “US leads exceptionally in biotechnology sector”, while the EU countries favor nanomaterials and Asian countries “show their strong research performances in nanoelectronics”.

The main originality of the present study lies in the description of the international landscape of nanoscience through a bottom-up partition of the field based on single articles. As pointed out by Rafols et al. [35], the advantage of these “local” maps is that they can be “more accurate in their description of the relations within a field” than maps obtained through top-down categories. We will show that this bottom-up approach is crucial to obtain a faithful description of countries’ specializations. Thanks to advances in methodology and computer power, there have recently been many articles using bottom-up methods to study scientific domains [36–41]. However, none has dealt with the description of the international landscape of nanosciences.

In this paper, we first show that the single dimension of a country’s ‘size’ is not sufficient to characterize countries’ specializations in nanoscience in a meaningful way. Then, we build a multidimensional landscape (hereafter the ‘nanoscape’), using the relevant subfields of nanoscience, to obtain a detailed map of countries’ specializations. We find a multipolar world of nanoscience research, structured around four main poles: the first gathers rich countries with ancient research traditions; the second and third group so-called ‘emergent’ countries, both with rapid scientific and economic growth but focused on different topics; and the fourth is mostly composed of former Eastern European communist countries, with strong research traditions concentrated in specific fields.

Brief Description of the Method

A detailed description of our method is given in the S1, S2 and S3 Appendices. In short, we have used the well-tested Arora et al. [18] query to gather the nanoscience records from Web of Science over three years (2010–2012, 340,350 records obtained). Table 1 shows the number of publications for each country.

Table 1. Essential size statistics for the intensity of nanoscience research among countries.

https://doi.org/10.1371/journal.pone.0166914.t001

To identify the relevant subfields for research in nanosciences, we use a ‘bottom-up’ strategy that creates groups of articles that share many references and are therefore close in cognitive space. Hereafter, we distinguish ‘disciplines’, which are predefined by the Web of Science through JSCs, from ‘subfields’, which are obtained by our bottom-up method. In practice, we create a network using the records as ‘nodes’ and their numbers of common references as ‘links’. On this network, we use the Louvain algorithm [42] to maximize modularity and identify the 36 relevant subfields for research in nanosciences. Each record belongs to a single subfield. This approach, detailed in the S1 Appendix, is well known in scientometrics under the label ‘bibliographic coupling’ and has been shown to lead to meaningful subfields [43,44]. The main subfields are listed in Table 2, and a detailed description of all of them is given in the S1 Dataset.

Table 2. Main nanoscience subfields (more than 5,000 articles).

https://doi.org/10.1371/journal.pone.0166914.t002
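To make the subfield construction described above concrete, the short sketch below builds a bibliographic-coupling network and partitions it by modularity maximisation. It is only an illustration, not the authors’ pipeline: the record identifiers and reference sets are invented, and the Louvain implementation shipped with networkx (version 2.8 or later) stands in for the Louvain algorithm of [42] used in the paper.

```python
from itertools import combinations

from networkx import Graph
from networkx.algorithms.community import louvain_communities

# Hypothetical toy input: each record ID mapped to the set of references it cites.
records = {
    "rec1": {"refA", "refB", "refC"},
    "rec2": {"refB", "refC", "refD"},
    "rec3": {"refX", "refY", "refZ"},
    "rec4": {"refY", "refZ", "refW"},
}

# Bibliographic coupling: link two records by the number of references they share.
G = Graph()
G.add_nodes_from(records)
for (a, refs_a), (b, refs_b) in combinations(records.items(), 2):
    shared = len(refs_a & refs_b)
    if shared > 0:
        G.add_edge(a, b, weight=shared)

# Modularity maximisation (Louvain): each record ends up in exactly one
# community, which plays the role of a 'subfield'.
subfields = louvain_communities(G, weight="weight", seed=0)
for i, members in enumerate(subfields):
    print(f"subfield {i}: {sorted(members)}")
```

With real data, the nodes would be the 340,350 Web of Science records and the resulting communities would correspond to the 36 subfields of Table 2.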

Then, we compute the proportion of articles for each country in each cluster (S2 Dataset). This corresponds to the ‘effort’ or ‘output’ that each country devotes to each subfield of nanoscience. By normalizing by the corresponding world ‘effort’, one recovers the well-known “Revealed comparative advantage” (RCA) index. Finally, we perform a Principal Component Analysis (PCA) using the FactoMineR package [45] to find the most meaningful correlations among countries’ RCAs. To interpret the PCA results, we add variables describing the countries’ socioeconomic characteristics, such as GDP or the rate of scientific growth.
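As a sketch of the two computations just described: a country’s RCA in a subfield is its share of articles in that subfield divided by the world share of that subfield, and the PCA is then run on the countries-by-subfields RCA table. The paper uses the FactoMineR package in R; the snippet below (pandas and scikit-learn, with invented counts and country names) only illustrates the same logic under those assumptions.

```python
import pandas as pd
from sklearn.decomposition import PCA

# Hypothetical article counts: rows are countries, columns are subfields.
counts = pd.DataFrame(
    {"TiO2MAT": [120, 30, 10, 25],
     "proteinBIO": [20, 90, 15, 40],
     "metalMAT": [40, 25, 60, 35]},
    index=["CountryA", "CountryB", "CountryC", "CountryD"],
)

# Revealed comparative advantage: the share of a country's articles in a
# subfield, divided by the world share of that subfield.
country_share = counts.div(counts.sum(axis=1), axis=0)
world_share = counts.sum(axis=0) / counts.values.sum()
rca = country_share / world_share

# PCA on the standardised RCA profiles (the paper used FactoMineR in R, which
# standardises variables by default; this is only an equivalent sketch).
rca_std = (rca - rca.mean()) / rca.std(ddof=0)
pca = PCA(n_components=2)
coords = pca.fit_transform(rca_std)
print(pca.explained_variance_ratio_)        # variance captured by each axis
print(pd.DataFrame(coords, index=rca.index, columns=["dim1", "dim2"]))
```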

Results

Does size matter?

As a first step, we analyze the international distribution of nanoscience articles (Table 1). It is clear that a country’s scientific ‘size’ (i.e. its total number of publications, first column) does not determine the intensity of its nanoscience research, given by the domestic share (last column). For example, the United States is by far the leader in science (world share of 24%) but not in nanoscience, which is dominated by China: China publishes more than one in five of all nanoscience articles, well above its 10% share of science as a whole. More generally, Table 1 shows a clear difference between most Asian countries (China, South Korea, Taiwan…), whose domestic share of nanoscience articles is above the world share, and many European countries, whose share is much lower (UK, Italy, Netherlands…). But, again, this geographic difference is not related to a size effect. In the next section, we produce a richer description of each country’s scientific production, to reveal the important dimensions that determine its position in the nanoscape.

The multipolar nanoscape obtained from the multidimensional analysis

To go beyond this simple size analysis, we partition the nanoscience field into relevant subfields using our ‘bottom-up’ strategy, which creates groups of articles that share many references and are therefore close in cognitive space. Table 2 shows the main nanoscience subfields found by our method (those with more than 5,000 articles).

The next step is to map the distribution of the articles of each country over the 36 subfields (see the S1 Dataset for the whole table). Principal Component Analysis (PCA, see S1 Appendix) then allows us to find the most significant dimensions that characterize the nanoscape, i.e. the international landscape of nanoscience research (Fig 1a and 1c). Intuitively, the PCA components represent the combinations of subfields (Fig 1b) that retain most of the information present in the data, while reducing the number of dimensions. In our case, PCA finds three significant components that explain 56% of the variance present in all the subfields. PCA takes advantage of correlations such as “countries that have a high share in the TiO2MAT cluster very often also have a high share in ZnOwirestMAT” to infer a similarity between those two subfields and the corresponding countries, and to display them in the same region of Fig 1a and 1b (we only discuss the two most important dimensions of the nanoscape; see the S1 Appendix for more details). The position of the arrows in Fig 1b and 1c arises from the position of the countries in the nanoscape and the corresponding values of their subfield shares or socio-economic characteristics. For example, countries in the upper-right quadrant of Fig 1a have high shares in “opticsMAT” or “drugBIO” (Fig 1b) and a substantial percentage of highly cited articles (Top10 arrow in Fig 1c).

Fig 1.

(a) First two axes of the PCA analysis that determines the ‘nanoscape’, i.e. the position of countries according to their profiles in nanoscience research. Colors correspond to OECD membership (black: founding member; blue: present member; red: non member). (b) Representation of the most significant subfields (square cosine higher than 0.1) in the first two axes of the nanoscape, i.e. the 20 subfields with the highest projections along these axes. Arrows point towards the countries (Fig 1a) that have high shares of the corresponding subfields. For example, OECD countries have a high share of “proteinBIO” articles (right side in both Figs 1a and 1b), while emergent countries have a high share in “batteryCHEM” (top left in both figures). (c) Additional socio-economic and scientific variables in the nanoscape. These are not used to compute the nanoscape, but are projected on the PCA axes to help interpret the results [43]. As in Fig 1b, arrows point towards the countries (Fig 1a) that have high values for the corresponding variable. Only the 32 most significant variables are shown: circuits; EastEur; emergent; general; nanoart; OCDE; RD.GDP; scientists; Top10; GDP; (BIOCHEM, Biochemistry Molecular Biology); (BIOPHY, Biophysics); (BIOTEC, Biotechnology Applied Microbiology); (CELLBIO, Cell Biology); (CHEM, Chemistry); (COMP, Computer Science); (CRYSTAL, Crystallography); (ELECHEM, Electrochemistry); (ENERG, Energy Fuels); (ENGI, Engineering); (ENVI, Environmental Sciences Ecology); (IMAGMED, Radiology Nuclear Medicine Medical Imaging); (INSTRUM, Instruments Instrumentation); (MATSCI, Materials Science); (MECH, Mechanics); (METAL, Metallurgy Metallurgical Engineering); (PHARMA, Pharmacology Pharmacy); (PHYS, Physics); (POLYM, Polymer Science); (SPECTRO, Spectroscopy); (THERMO, Thermodynamics); (TOXIC, Toxicology). See details in S1 Appendix.

https://doi.org/10.1371/journal.pone.0166914.g001

The main results of our analysis can be summarized as follows. In terms of subfields (Fig 1b), the first dimension opposes subfields related to cellular biology or biochemistry (such as proteinBIO or drugBIO, right side) to subfields related to materials science, such as TiO2MAT or thermoMAT (left side). The second dimension opposes traditional subfields related to physics or metallurgy, such as metalMAT or magnetPHYS, to more interdisciplinary subfields such as fibersBIO. From the socio-economic point of view (Fig 1a), the first dimension opposes rich countries (right) to less-developed countries (left), while the second dimension opposes Eastern European countries (bottom) to countries that are emerging in the scientific arena (top).

This analysis is fully confirmed by the position of the additional variables (Fig 1c). On the right-hand side, we find countries with higher GDP, higher investment in research and development (‘RD.GDP’), a higher proportion of scientists in the population (‘scientists’) and a higher share of top-cited articles (‘Top10’). These countries also have larger shares of their total publications (not only those in nanosciences) in cellular biology (‘CELLBIO’), biochemistry (‘BIOCHEM’) and biophysics (‘BIOPHY’). In contrast, countries located on the left-hand side of Fig 1a are ‘emergent’, i.e. they have rapidly increased their number of scientific articles in the last 20 years. They have larger shares of total publications in polymer science (‘POLYM’), engineering (‘ENGI’) or materials science (‘MATSCI’). The second axis opposes countries located on the lower side, which publish many articles in the disciplines of metallurgy (‘METAL’) or physics (‘PHYS’), to countries located on the upper side, which have a high share of articles in fields such as environment (‘ENVI’) or toxicology (‘TOXIC’).
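These additional variables are handled as supplementary variables in the PCA sense: they do not enter the computation of the axes, but each one is positioned by its correlation with the countries’ coordinates on those axes, which is how FactoMineR treats supplementary quantitative variables. The snippet below is a minimal sketch of that idea, with invented coordinates and an invented GDP variable.

```python
import numpy as np

# Hypothetical country coordinates on the first two PCA axes (dim1, dim2),
# and a supplementary socio-economic variable (e.g. GDP per capita) that was
# not used to build the axes.
coords = np.array([[ 2.1,  0.4],
                   [-1.3,  1.0],
                   [-0.8, -1.4],
                   [ 1.5, -0.6]])
gdp = np.array([45000.0, 9000.0, 12000.0, 38000.0])

# The supplementary variable is positioned by its correlation with each axis,
# so its arrow points towards countries with high values, as in Fig 1c.
arrow = [np.corrcoef(gdp, coords[:, k])[0, 1] for k in range(coords.shape[1])]
print("GDP arrow (dim1, dim2):", arrow)
```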

To further interpret the nanoscape, it is useful to create groups of similar countries (details given in the S1 Appendix). A standard k-means algorithm allows us to create, in an objective way, four groups of countries that are close in the nanoscape (see the sketch below). These groups confirm to a great extent the previous categorization. The first cluster gathers mostly OECD countries: 78% of them are OECD founding members, compared to 19% in the other clusters (p-value < 0.001). A second cluster essentially groups former communist countries from Eastern Europe: they represent 60% of the countries of this cluster, compared to 10% for the other clusters (p-value = 0.014). The k-means algorithm introduces a distinction between two types of emergent countries: one specialised in the production of electronic devices (led by South Korea, China and Malaysia) and a second, more specialised in standard chemical and physical methods of material synthesis, led by Iran and South Africa. This distinction corresponds to the information contained in the third dimension of the PCA, which is taken into account in the clustering analysis.
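A minimal sketch of this clustering step, assuming the countries are represented by their coordinates on the first three PCA dimensions: scikit-learn’s k-means (used here in place of whatever implementation the authors relied on) partitions these hypothetical coordinates into four groups.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical coordinates of eight countries on the first three PCA dimensions.
countries = [f"Country{c}" for c in "ABCDEFGH"]
X = np.array([[ 2.3,  0.2,  0.1],
              [ 1.8, -0.1,  0.3],
              [-1.5,  1.2, -0.9],
              [-1.9,  0.8,  1.1],
              [-0.2, -1.6,  0.2],
              [-0.4, -1.9, -0.1],
              [-1.2,  1.0,  1.3],
              [ 2.0,  0.4, -0.2]])

# Partition the countries into four groups of nearby points in the nanoscape.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
for country, label in zip(countries, labels):
    print(country, "-> group", label)
```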

Features that do not appear in the nanoscape are also interesting. The total number of articles published does not appear in Fig 1c, confirming the absence of ‘size’ as a relevant variable. For example, China and Bulgaria have very different sizes but are close in the nanoscape. Conversely, Ukraine, Pakistan and Thailand have each published about 20,000 articles, but they have completely different shares in the different subfields and therefore different positions in the nanoscape. One could also wonder why there are no countries with both a high domestic share of nanoscience articles and a high share of biochemistry or cellular biology (opposing ‘nanoart’ and ‘BIOCHEM’ arrows in Fig 1c). A tentative explanation is the inertia of scientific communities. When countries have well-structured and ancient scientific traditions, which are needed to build biology communities, it is difficult to reorient 15% of the scientists into a new field in a few years. Instead, if a country’s scientific communities are young, it is easier to develop new fields through central financing agencies.

We end by emphasizing the importance of building the subfields bottom-up to achieve a meaningful representation of the different scientific domains. In such a multidisciplinary field, most subfields mix various disciplines, as confirmed by their fragmented composition in terms of Journal Subject Categories (S3 Dataset). In general, five JSCs are present at significant levels (more than 10% of the articles), and the most important JSC rarely reaches 50%. This means that JSCs such as “Materials Science, Multidisciplinary”, “Nanoscience & Nanotechnology”, “Chemistry, Multidisciplinary” or “Physics, Applied” are too wide to characterize precise subfields within nanoscience. Instead, our bottom-up approach captures important (but subtle for the outsider) differences between subfields. Take for example the two subfields related to “Quantum dots”. As can be seen through the most cited references and keywords, the first subfield (labeled “QDotsMAT”) mainly deals with luminescent semiconductor quantum dots, prepared in solvents and covalently coupled to biomolecules, for use in biological imaging and detection. Instead, “QDotsPHYS” prepares quantum dots by molecular beam epitaxy and uses them for fundamental physics problems, such as spintronics, quantum coherence and quantum computing. This scientific difference is correlated with strong contrasts in the countries’ specializations. Emergent countries focus on the first subfield, while members of the OECD specialize in the second, as shown by the countries’ shares (S2 Dataset) and summarized by the arrows for these subfields in Fig 1b. A similar contrast is found for “graphenePHYS” and “grapheneMAT” (S3 Appendix).

Discussion: A Multipolar World

We have presented a method that, by using a bottom-up definition of scientific subfields, is able to map the international structure of any scientific field while remaining faithful to the specificities of that field. Our method improves on the overly generic description of scientific fields in terms of standard disciplines, such as the Journal Subject Categories from Web of Science (see the S3 Appendix for a full discussion of this point). In the present application to nanoscience, we have shown that a country’s size does not contain much information about its position in the nanoscape. Instead of a universal pattern of specialisation that simply scales the profile of a single global leader (the US), we find a multipolar world with four distinctive profiles.

There are several reasons for these four (main) different ways of tackling nanoscience. The most important is that countries approach emerging fields starting from their specific position in the general scientific landscape, which reflects their specific strengths. This is particularly clear for an interdisciplinary field such as nanoscience, which can be entered from a variety of disciplinary angles. In practice, nanoscience means something different for (Eastern European) countries with a strong background in physics or metallurgy than for (OECD) countries with strong biomedical research.

In this respect, our study connects to (and updates) previous mappings of science as a whole [12,13,46,47]. According to Glänzel (2001), four basic paradigmatic patterns in publication profiles could be distinguished at that time: the “western model”, with clinical medicine and biomedical research as the dominating fields; the former socialist countries, with “excessive activity” in chemistry and physics; the ‘bio-environmental model’, with biology and earth and space sciences in the main focus; and finally the ‘Japanese model’, with engineering and chemistry predominant. A similar study was carried out recently [14] and found some evolution of this pattern. It proposed three distinct types: “well-developed” countries with a strong specialisation in biomedical disciplines, a group of former “iron-curtain” countries with many publications in physics, chemistry and engineering, and finally a group of “less-developed” countries with a strong record in “agricultural” subjects. Our work confirms the importance of the first two groups (“well-developed” and “former iron-curtain”) and shows how their specific strengths explain their approach to nanosciences. However, the last two groups from Glänzel (2001) and the last group from [14] are not relevant for nanosciences. Instead, we have shown the importance of a group of emergent countries, focusing on engineering and chemistry (as Japan used to do), that were hardly visible in 2001 but are now among the most important in the world.

Clearly, the scientific landscape is in continuous evolution, and the snapshot we present here is likely to change in a few years. These evolutions may preserve the overall landscape (i.e., the meaning of the first two dimensions), but countries will probably shift positions. Or new scientific dimensions may emerge as more significant, dramatically changing the nanoscape. In both cases, future work could combine quantitative and qualitative research to investigate the origins of these poles and their evolution. We can list a few candidates: the specific scientific traditions of each country or region; the impact of science and technology policies; the weight of knowledge-based industries… For example, nanosciences have been, for more than 15 years, a priority of the science policies of OECD countries [16]. Industrial research does not have the same impact in all countries, as some of them have strong and ancient scientific systems but have traditionally been weak at turning scientific knowledge into industrial applications.

Our findings shed new light on ‘center-periphery’ relationships [48,49]. It is well known that some ‘developing countries’ are now becoming global leaders (such as China and India) or very active in scientific research (such as most South Asian countries and Brazil) [8,50]. In addition, our map shows that the emergence of these new centers (such as China) also implies a corresponding rearrangement of new peripheries, within the frame of a more complex worldwide division of scientific work.

Finally, this method could be used to investigate the international landscape of other fields. Several factors may affect international specialisation: the presence of a large instrument (such as an accelerator or an observatory) [51], the availability of a specific resource (such as tropical species) [52], or links to nationally strong industries for applications.

Some features of the nanoscape are likely to be specific to this field, especially the rapid growth of emergent countries. The reason is that nanoscience seems to have a relatively low entry cost, as compared to biochemistry or cellular biology. For example, there exist several inexpensive technologies (such as nanoimprint lithography, see [53]) that allow the development of some subfields. Our approach, which builds a micro-level description relevant for studying the macro level, could help to understand in a more general way the relative contribution of these different factors to specialisation profiles in different fields.

Supporting Information

S3 Appendix. Why using bottom-up subfields instead of the simpler Journal Subject Categories from ISI?

https://doi.org/10.1371/journal.pone.0166914.s003

(DOCX)

Author Contributions

  1. Conceptualization: LL PK PJ.
  2. Data curation: LL PK PJ.
  3. Formal analysis: LL PK PJ.
  4. Investigation: LL PK PJ.
  5. Methodology: PJ.
  6. Project administration: LL PK PJ.
  7. Resources: LL PK PJ.
  8. Software: LL PJ.
  9. Supervision: LL PK PJ.
  10. Validation: LL PK PJ.
  11. Visualization: LL PK PJ.
  12. Writing – original draft: LL PK PJ.
  13. Writing – review & editing: LL PK PJ.

References

  1. Costinot A, Donaldson D, Komunjer I. What Goods Do Countries Trade? A Quantitative Exploration of Ricardo’s Ideas. Rev Econ Stud [Internet]. 2012;79(2):581–608. Available from: http://restud.oxfordjournals.org/lookup/doi/10.1093/restud/rdr033
  2. Ricardo D. On the Principles of Political Economy and Taxation. London: John Murray; 1817.
  3. Salomon JJ, Lebeau A. L’écrivain public et l'ordinateur: mirages du développement. Paris: Hachette; 1988. 350 p.
  4. Salomon J-J. Modern science and technology. In: Salomon J-J, Sagasti FR, Sachs-Jeantet C, editors. The Uncertain Quest: science, technology, and development. Tokyo-New York-Paris: The United Nations University; 1994.
  5. King DA. The scientific impact of nations: what different countries get for their research spending. Nature. 2004;430:311–6. pmid:15254529
  6. May RM. The scientific wealth of nations. Science. 1997;275:793–6.
  7. Shelton RD, Holdridge GM. The US-EU race for leadership of science and technology: Qualitative and quantitative indicators. Scientometrics. 2004;60(3):353–63.
  8. Jin B, Rousseau R. Evaluation of Research Performance and Scientometric Indicators in China. In: Moed HF, Glänzel W, Schmoch U, editors. Handbook of Quantitative Science and Technology Research. Dordrecht: Kluwer Academic Publishers; 2004. p. 497–514.
  9. Leydesdorff L, Zhou P. Are the contributions of China and Korea upsetting the world system of science? Scientometrics. 2005 Jun;63(3):617–30.
  10. Leydesdorff L. Evaluation of research and evolution of science indicators. Curr Sci. 2005 Nov;89(9):1510–7.
  11. Dore JC, Ojasoo T, Okubo Y, Durand T, Dudognon G, Miquel JF. Correspondence factor analysis of the publication patterns of 48 countries over the period 1981–1992. J Am Soc Inf Sci. 1996;47(8):588–602.
  12. Glänzel W. National characteristics in international scientific co-authorship relations. Scientometrics. 2001;51(1):69–115.
  13. Miquel JF, Ojasoo T, Okubo Y, Paul A, Dore JC. World science in 18 disciplinary areas. Comparative evaluation of the publication patterns of 48 countries over the period 1981–1992. Scientometrics. 1995;33(2):149–67.
  14. Moya-Anegon F, Herrero-Solana V. Worldwide Topology of the Scientific Subject Profile: A Macro Approach in the Country Level. PLoS One. 2013;8(12).
  15. Balassa B. Trade Liberalisation and Revealed Comparative Advantage. Manchester Sch Econ Soc Stud. 1965;33(2):99–123.
  16. E.U. Emerging Science and Technology priorities in public research policies in the EU, the US and Japan. Luxembourg; 2006.
  17. PCAST. Report to the President and the Congress on the Third Assessment of the National Nanotechnology Initiative [Internet]. Washington DC; 2014. https://www.whitehouse.gov/sites/default/files/microsites/ostp/PCAST/pcast_fifth_nni_review_oct2014_final.pdf
  18. Arora SK, Porter AL, Youtie J, Shapira P. Capturing new developments in an emerging technology: an updated search strategy for identifying nanotechnology research outputs. Scientometrics [Internet]. 2013;95(1):351–70. Available from: http://link.springer.com/10.1007/s11192-012-0903-6
  19. Bassecoulard E, Lelu A, Zitt M. Mapping nanosciences by citation flows: A preliminary analysis. Scientometrics. 2007 Mar;70(3):859–80.
  20. Huang C, Notten A, Rasters N. Nanoscience and technology publications and patents: a review of social science studies and search strategies. J Technol Transf. 2011;36(2):145–72.
  21. Mogoutov A, Kahane B. Data search strategy for science and technology emergence: A scalable and evolutionary query for nanotechnology tracking. Res Policy. 2007 Jul;36(6):893–903.
  22. Porter A, Youtie J, Shapira P, Schoeneck D. Refining search terms for nanotechnology. J Nanoparticle Res [Internet]. 2008;10(5):715–28. Available from: http://dx.doi.org/10.1007/s11051-007-9266-y
  23. Avila-Robinson A, Miyazaki K. Evolutionary paths of change of emerging nanotechnological innovation systems: the case of ZnO nanostructures. Scientometrics. 2013 Jun;95(3):829–49.
  24. Liu N, Guan J. Dynamic evolution of collaborative networks: evidence from nano-energy research in China. Scientometrics. 2015 Mar;102(3):1895–919.
  25. Porter AL, Youtie J. How interdisciplinary is nanotechnology? J Nanoparticle Res [Internet]. 2009 Jul 6;11(5):1023–41. Available from: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2988207/
  26. Cunningham S, Porter A. Bibliometric discovery of innovation and commercialization pathways in nanotechnology. In: Proceedings of the Portland International Conference on Management of Engineering and Technology. Portland; 2011.
  27. Miyazaki K, Islam N. Nanotechnology systems of innovation—An analysis of industry and academia research activities. Technovation. 2007 Nov;27(11):661–75.
  28. Schummer J. The global institutionalization of nanotechnology research: A bibliometric approach to the assessment of science policy. Scientometrics. 2007;70(3):669–92.
  29. Ovalle-Perandones M-A, Gorraiz J, Wieland M, Gumpenberger C, Olmeda-Gomez C. The influence of European Framework Programmes on scientific collaboration in nanotechnology. Scientometrics. 2013 Oct;97(1):59–74.
  30. Pouris A. Nanoscale research in South Africa: A mapping exercise based on scientometrics. Scientometrics. 2007 Mar;70(3):541–53.
  31. Gorjiara T, Baldock C. Nanoscience and nanotechnology research publications: a comparison between Australia and the rest of the world. Scientometrics. 2014 Jul;100(1):121–48.
  32. da S Sant’Anna L, de Menezes Alencar MS, Ferreira AP. Nanomaterials patenting in Brazil: some considerations for the national regulatory framework. Scientometrics. 2014 Sep;100(3):675–86.
  33. Zhao Y-L, Song Y-L, Song W-G, Liang W, Jiang X-Y, Tang Z-Y, et al. Progress of nanoscience in China. Front Phys. 2014 Jun;9(3):257–88.
  34. Islam N, Miyazaki K. An Empirical Analysis of Nanotechnology Research Domains. Technovation. 2010;30(4):229–37.
  35. Rafols I, Leydesdorff L. Science Overlay Maps: A New Tool for Research Policy and Library Management. J Am Soc Inf Sci Technol. 2010;61(9):1871–87.
  36. Bergstrom C, Rosvall M. Maps of random walks on complex networks reveal community structure. Proc Natl Acad Sci USA. 2008;105(4):1118–23.
  37. Börner K. Atlas of Science: Visualizing What We Know. MIT Press; 2010.
  38. Grauwin S, Beslon G, Fleury É, Franceschelli S, Robardet C, Rouquier J-B, et al. Complex systems science: Dreams of universality, interdisciplinarity reality. J Am Soc Inf Sci Technol [Internet]. 2012;63(7):1327–38. Available from: http://dx.doi.org/10.1002/asi.22644
  39. Klavans R, Boyack KW. Toward a consensus map of science. J Am Soc Inf Sci Technol. 2009;60(3):455–76.
  40. Rafols I, Meyer M. Diversity and network coherence as indicators of interdisciplinarity: Case studies in bionanoscience. Scientometrics. 2010;82(2):263–87.
  41. Small H. Visualizing science by citation mapping. J Am Soc Inf Sci. 1999;50(9):799–813.
  42. Blondel V, Guillaume J-L, Lambiotte R, Lefebvre E. Fast unfolding of communities in large networks. J Stat Mech Theory Exp [Internet]. 2008;2008(10):P10008. Available from: http://stacks.iop.org/1742-5468/2008/i=10/a=P10008
  43. Grauwin S, Jensen P. Mapping scientific institutions. Scientometrics. 2011;89(3):943–54.
  44. Kessler MM. Bibliographic coupling between scientific papers. Am Doc [Internet]. 1963;14(1):10–25. Available from: http://dx.doi.org/10.1002/asi.5090140103
  45. Le S, Josse J, Husson F. FactoMineR: An R package for multivariate analysis. J Stat Softw. 2008 Mar;25(1):1–18.
  46. Campbell D, Lefebvre C, Picard-Aitken M, Côté G, Ventimiglia A, Roberge G, et al. Country and Regional Scientific Production Profiles. Luxembourg; 2013.
  47. Viola P, Bruno N. International Science & Technology Specialisation: Where does Europe stand? Brussels; 2010.
  48. Kreimer P. Délocalisation des savoirs en Amérique latine: le rôle des réseaux scientifiques. Pouvoirs Locaux. 2012;III:25.
  49. MacLeod R. Nature and Empire: Science and the Colonial Enterprise. Vol. 15, Osiris. Chicago: University of Chicago Press; 2001.
  50. de Almeida ECE, Guimarães JA. Brazil’s growing production of scientific articles—how are we doing with review articles and other qualitative indicators? Scientometrics. 2013;97(2):287–315.
  51. Joerges B, Shinn T. A fresh look at instrumentation: an introduction. In: Instrumentation between science, state and industry, Sociology of the sciences yearbook. Dordrecht: Kluwer Academic Publishers; 2001.
  52. Kreimer P, Zabala JP. Chagas Disease in Argentina: Reciprocal Construction of Social and Scientific Problems. Sci Technol Soc. 2007;(12):49–72.
  53. Tong WM, Hector SD, Jung GY, Wu W, Ellenson J, Kramer K, et al. Nanoimprint lithography: the path toward high tech, low cost devices. In: Mackay R, editor. Emerging Lithographic Technologies IX, Pts 1 and 2. Bellingham: Society of Photo-Optical Instrumentation Engineers; 2005. p. 46–55. (Proceedings of the Society of Photo-Optical Instrumentation Engineers (SPIE); vol. 5751).