Science 2.0 study

Updates on progress and discussion of the results of the "Science 2.0: implications for European policies on research and innovation" study

Open Access is not a luxury, it is a must-have for the EU

Have a look at, and comment on, Commissioner Neelie Kroes's speech opening the PEER2012 conference.

The EC is working on extending the requirements for EU-funded projects to include data sharing, and on Recommendations for Member States on improving the access, management and preservation of scientific results.

Openness in the Research Cycle

We’re looking for a model that lets us describe the changes that Science 2.0 brings to the research process. First, we proposed a division between open science, citizen science and data-intensive science.

Now we have focused on the research cycle, trying to capture the different applications at different stages of the research process. The inner cycle in the diagram below represents the stages of the research process, from conceptualisation to the publication of a peer-reviewed article.

In the Science 2.0 model, openness and the principles of sharing and collaboration are (or can be) present at every stage of the research process, whereas in the traditional model the only result that is shared is the peer-reviewed article (often behind a paywall).

Walking through the cycle stage by stage:

– At the conceptualisation stage, open discussion of ideas (blogs, fora) and knowledge sharing (open annotation, open bibliographies) are important.

– Next comes data gathering, where data and research practice can be shared in real time (open data, open lab notebooks) and data can be collected in collaboration with citizens. Depositing the data so that it can be analysed further requires eInfrastructures.

– In many instances the data can then be analysed with the help of volunteers (citizen science) and through open collaboration (collaborative analysis); the analysis can also be facilitated by sharing open software.

– The outcome of the analysis can be published as an article or book chapter (which can be updated in an instant – liquid publications), but also as a statement accompanied by metadata and linked with other statements (nanopublications – see the sketch below). The article can be published in an open access journal or deposited in an institutional repository, allowing wider accessibility, and the data can be published and linked to the article.

– Finally, publications are subject to review by the academic community, which establishes the importance of the findings and filters the growing volume of scientific literature according to its relevance and significance for the field. Publications can be opened to post-publication peer review, where the community openly discusses the importance of a discovery, and other reputation systems distinct from peer review can be used to measure scientific excellence and author/publication impact (e.g. altmetrics).
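Nanopublications are the most structured of these outputs, so here is a minimal sketch of one in Python using rdflib. The URIs, vocabulary and single-graph layout are illustrative assumptions rather than the formal nanopublication scheme (which separates the assertion, its provenance and publication information into named graphs); the sketch only shows the basic shape of "a statement plus linked metadata".

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF, XSD

# Hypothetical namespaces, chosen for illustration only.
EX = Namespace("http://example.org/")
PROV = Namespace("http://www.w3.org/ns/prov#")

g = Graph()
g.bind("ex", EX)
g.bind("prov", PROV)
g.bind("dcterms", DCTERMS)

# The assertion: a single scientific statement expressed as one triple.
g.add((EX.GeneX, EX.isAssociatedWith, EX.DiseaseY))

# Metadata about that assertion: who made it, when, and from which dataset.
assertion = URIRef("http://example.org/assertion/1")
g.add((assertion, RDF.type, PROV.Entity))
g.add((assertion, PROV.wasAttributedTo, EX.researcherJaneDoe))
g.add((assertion, DCTERMS.created, Literal("2012-05-15", datatype=XSD.date)))
g.add((assertion, DCTERMS.source, URIRef("http://example.org/dataset/42")))

print(g.serialize(format="turtle"))
```

Because everything is expressed as linked statements, the metadata can itself be queried, cited and connected to other researchers' assertions.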

What’s missing in our diagram? What should be added or changed to better capture the Science 2.0 phenomenon?

New ways to evaluate scientists

The main bottleneck to the adoption of the Science 2.0 paradigm is the lack of recognition. The careers of scientists are determined by publications and citations, and there is no recognition for releasing data, code, or laboratory notebooks. Moreover, effective reputation management tools can play a key role in finding the right micro-expertise to involve in large-scale collaborative efforts.

As GrrlScientist puts it:

If there is no way to ensure that scientists get credit for their ideas and intellectual contributions, then they will not contribute to the Open Science movement. Traditionally, the way that credit has been assigned to scientists has been through publication of their data in peer-reviewed journals and by citing their colleagues’ work in their papers.

Michael Nielsen recognizes this as well in his book.

In our paper, we point to the possibility of creating new ways of managing reputation, such as the open source example of IBM.

In our study, we’re looking for actual implementations of reputation management for scientists. So far we’ve come across:

PeerEvaluation, a service which helps scientists share their data and papers and thereby measure their reputation

Altmetrics, a service which maps the reputation of scientists by monitoring how people use their papers on CiteULike, Mendeley and Zotero

This is closely related to, and overlaps with, alternative ways to do peer review, such as F1000.

However, these services remain highly experimental and there is little data about how they are used. Do you have any evidence of uptake and impact of alternative ways to evaluate scientists?

Open text, data and code

Just came across Ten Brighter Ideas (HT Jon Udell)

The tool behind it allows you to see the assumptions and calculations behind the recommendations for environment-friendly behavior.

It’s a case of:

– open text, where you are able to explore the rationale behind each statement

– open data, where you can see the data behind the calculation

– open code, as you can directly act on the code and modify the calculations

On top of that, it has a great design.
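To make the combination concrete, here is a hypothetical sketch of what such an "open calculation" could look like in Python: the assumptions (the open data) are explicit, editable values, and the calculation (the open code) is a small function a reader can inspect, change and re-run. The figures and names are made up for illustration and are not taken from Ten Brighter Ideas.

```python
# Hypothetical "open calculation": every assumption is an explicit, editable value.
# All numbers below are placeholders for illustration, not real data.

ASSUMPTIONS = {
    "standby_power_watts": 4.0,      # average standby draw of one appliance
    "appliances_per_household": 10,  # appliances left on standby
    "standby_hours_per_day": 20.0,   # hours per day spent on standby
    "kwh_price_eur": 0.20,           # electricity price per kWh
}

def yearly_saving_eur(a: dict) -> float:
    """Estimated yearly saving from switching appliances off at the wall."""
    kwh_per_year = (
        a["standby_power_watts"] / 1000.0
        * a["appliances_per_household"]
        * a["standby_hours_per_day"]
        * 365
    )
    return kwh_per_year * a["kwh_price_eur"]

if __name__ == "__main__":
    # Readers can question an assumption, edit it, and see how the recommendation changes.
    print(f"Estimated saving: {yearly_saving_eur(ASSUMPTIONS):.2f} EUR per year")
```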

Our reference list is on Mendeley

We have created a group on Mendeley to share the references we’re collecting during the desk research. See, join, and add papers to our group.

Is there a Yammer for scientists?

Coordination costs are the main barrier to increased collaboration in science.

What are the best, most innovative software tools for enabling collaboration between scientists?

For example, is there a Yammer for scientists?

Scientific evidence that gets better the more scientists use it

Just as with web services and collaborative public services, data sharing allows for post-scarcity quality gains: the more people use shared data, the better it becomes.

Each researcher’s data will get better the more other researchers use it.

The analysis will get better as well.

As David (2011) puts it:

data-sets are not subject to being “over-grazed”, but instead are likely to be enriched and rendered more accurate the more that researchers are allowed to comb through them.
