Tag Archives: accuracy

Google Analytics statistics for SciCast, as of May 22, 2015.

SciCast Final Report Released

The final SciCast annual report has been released!  See the “About” or “Project Data” menus above, or go directly to the SciCast Final Report download page.

Executive Summary (excerpts)

Registration and Activity

SciCast has seen over 11,000 registrations and over 129,000 forecasts. Google Analytics reports over 76K unique IP addresses (suggesting roughly 8 visitors per registered user) and 1.3M pageviews. The average session duration was 5 minutes.



SciCast Final Report (Public)

The SciCast 2015 Annual Report has been approved for public release. The report focuses on Y4 activities, but also includes a complete publication and presentation list for all four years.  Please click “Download SciCast Final Report”  to get the PDF.  You may also be interested in the SciCast anonymized dataset.

Here are two paragraphs from the Executive Summary:

We report on the fourth and final year of a large project at George Mason University developing and testing combinatorial prediction markets for aggregating expertise. For the first two years, we developed and ran the DAGGRE project on geopolitical forecasting. On May 26, 2013, we renamed ourselves SciCast, engaged Inkling Markets to redesign our website front-end and handle both outreach and question management, re-engineered the system architecture and refactored key methods to scale up by 10x–100x, engaged Tuuyi to develop a recommender service to guide people through the large number of questions, and pursued several engineering and algorithm improvements, including smaller and faster asset data structures, backup approximate inference, and an arc-pricing model and dynamic junction-tree recompilation that allowed users to create their own arcs. Inkling built a crowdsourced question-writing platform called Spark. The SciCast public site (scicast.org) launched on November 30, 2013, and began substantial recruiting in early January 2014.

As of May 22, 2015, SciCast has published 1,275 valid questions and created 494 links among 655 questions. Of these, 624 questions are open now, of which 344 are linked (see Figure 1). SciCast has an average Brier score of 0.267 overall (0.240 on binary questions), beating the uniform distribution 85% of the time, by about 48%. It is also 18–23% more accurate than the available baseline: an unweighted average of its own “Safe Mode” estimates, even though those estimates are informed by the market. It beats that unweighted linear opinion pool (ULinOP) about 7 times in 10.

You are welcome to cite this annual report.  Please also cite our Collective Intelligence 2014 paper and/or our International Journal of Forecasting 2015 paper (if it gets published — under review now).

Sincerely,

Charles Twardy and the SciCast team


So Long, and Thanks for All the Fish!

SciCasters:

Thank you for your participation over the past year and a half in the largest collaborative S&T forecasting project ever. Our main IARPA funding has ended, and we were not able to finalize things with our (likely) new sponsor in time to keep the question management, user support, engineering support, and prizes running uninterrupted. Therefore we will be suspending SciCast Predict for the summer, starting June 12, 2015 at 4 pm ET.  We expect to resume in the fall with the enthusiastic support of a big S&T sponsor. In the meantime, we will continue to update the blog and provide links to leaderboard snapshots and important data.

Recap

Through the course of this project, we’ve seen nearly 130,000 forecasts from thousands of forecasters on over 1,200 forecasting questions, and an average of >240 forecasts per day. We created a combinatorial engine robust enough to allow crowdsourced linking, resulting in the following rich domain structure:


Near-final question structure on SciCast, with most of the live links provided by users. (Click for full size)

Some project highlights:

  • The market beat its own unweighted opinion pool (from Safe Mode) 7/10 times, by an average of 18% (measured by mean daily Brier score on a question)
  • The overall market Brier was about 0.29
  • The project was featured in The Wall Street Journal and Nature and many other places
  • SciCast partnered with AAAS, IEEE, and the FUSE program to author more than 1,200 questions
  • Project principals Charles Twardy and Robin Hanson answered questions in a Reddit Science AMA
  • SciCasters weighed in on news movers & shakers like the Philae landing and Flight MH370
  • SciCast held partner webinars with ACS and with TechCast Global
  • SciCast hosted questions (and provided commentary) for the Dicty World Race
  • In collaboration with The Discovery Analytics Center at Virginia Tech and Healthmap.org, SciCast featured questions about the 2014–2015 flu season
  • SciCast gave away BIG prizes for accuracy and combo edits
  • Other researchers are using SciCast for analysis and research in the Bitcoin block size debate
  • MIT and ANU researchers studied SciCast accuracy and efficiency, and were unable to improve on it using stock machine learning — a testament to our most active forecasters and their bots. [See links for Della Penna, Adjodah, and Pentland 2015, here.]
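The first bullet above compares the market to its own unweighted pool using the mean daily Brier score on a question. The sketch below illustrates that scoring scheme only in broad strokes — the `brier` and `mean_daily_brier` helpers, the snapshot timing, and the example numbers are assumptions for illustration, not SciCast's actual code: score one forecast snapshot per day against the eventual outcome, then average over days.

```python
def brier(forecast, outcome_index):
    """Full-distribution Brier score: sum of squared differences between
    the forecast probabilities and the realized 0/1 outcome indicator.
    Ranges from 0 (perfect) to 2 (fully confident and wrong)."""
    return sum((p - (1.0 if i == outcome_index else 0.0)) ** 2
               for i, p in enumerate(forecast))

def mean_daily_brier(daily_forecasts, outcome_index):
    """Average the Brier score over one forecast snapshot per day."""
    scores = [brier(f, outcome_index) for f in daily_forecasts]
    return sum(scores) / len(scores)

# Hypothetical binary question that resolved to option 0:
market_days = [[0.6, 0.4], [0.7, 0.3], [0.9, 0.1]]  # daily market snapshots
pool_days   = [[0.5, 0.5], [0.6, 0.4], [0.7, 0.3]]  # unweighted Safe-Mode pool
market_wins = mean_daily_brier(market_days, 0) < mean_daily_brier(pool_days, 0)
```

On this toy question the market's sharper probabilities give it the lower mean daily Brier, which is the sense in which it "beat" the pool 7 times out of 10 across real questions.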

What’s Next?

Prizes for the combo-edits contest will be sent out this week, and we will share a blog post summarizing the project. Although SciCast.org will be closed, this blog and the user group will remain open.  Watch for announcements about the future of SciCast.

Once again, thank you so much for your participation!  We’re nothing without our crowd.

Contact

Please contact us at contact@scicast.org if you have questions about the research project or want to talk about using SciCast in your own organization.



SciCast Accuracy Incentives Contest Has Ended

The SciCast Accuracy Incentives contest has come to an end. The final accuracy leaderboard will be selected at a random time in the next few weeks. This allows more questions to resolve, and randomizing the selection time reduces the impact of attempts to inflate scores on unresolved contest questions.

Read more about the contest.


Additional Questions Added to Accuracy Contest

On February 12, 2015, we are adding additional questions to the final round of the accuracy contest. Forecasts on these questions through March 6, 2015, have their market scores calculated and added to a person’s “portfolio.” The best portfolios at a time shortly after March 7, 2015, will win big prizes.



Accuracy contest: February questions are active!

Can you be the most accurate forecaster on SciCast? Look for the questions marked with a gold Au symbol or select the “Prize Eligible” topics when searching questions. Forecasts on these questions through March 6th will have their market scores calculated and added to a person’s “portfolio.” The best portfolios at a time shortly after March 7, 2015, will win big prizes.



SciCast Accuracy Calculations

We have started adding “Accuracy” numbers to emails. For example:

  • Your average accuracy on this question was 83.
  • SciCast’s average accuracy on this question was 90.

What does that mean?  The short answer is that Accuracy is a transform of the familiar Brier score, which we have mentioned in several blog posts.  Where the Brier score measures your error (low is good), Accuracy measures your success (high is good). This is more intuitive … except when it’s not.
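To make the error-vs-success distinction concrete, here is the standard Brier score alongside one plausible rescaling to a 0–100 Accuracy scale. The `accuracy` transform below is a hypothetical illustration only — the excerpt above does not give SciCast's exact formula:

```python
def brier(forecast, outcome_index):
    """Standard full-distribution Brier score (low is good):
    0 is a perfect forecast, 2 is a fully confident wrong one."""
    return sum((p - (1.0 if i == outcome_index else 0.0)) ** 2
               for i, p in enumerate(forecast))

def accuracy(brier_score):
    """Hypothetical transform (high is good): linearly maps the best
    possible Brier (0) to 100 and the worst (2) to 0. Illustrative only;
    not necessarily SciCast's actual formula."""
    return 100.0 * (1.0 - brier_score / 2.0)

# A confident correct forecast scores near 100; a 50/50 hedge scores 75.
print(accuracy(brier([0.9, 0.1], 0)))  # 99.0
print(accuracy(brier([0.5, 0.5], 0)))  # 75.0
```

Under any such monotone transform, comparing your Accuracy to SciCast's (e.g. 83 vs. 90) carries the same information as comparing Brier scores, just with the "bigger is better" polarity most people expect.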



Accuracy Contest: Second round questions

The second round of questions has been selected for the new accuracy contest. Forecasts on these questions from December 7, 2014, through January 6, 2015, have their market scores calculated and added to a person’s “portfolio.” The best portfolios at a time shortly after March 7, 2015, will win big prizes.

