Support for community applications:

{| class="wikitable"
! Application !! ETL import !! Pentaho reports !! Jaspersoft reports
|-
| Drupal || MySQL import || || [http://drupal.org/project/query_export Query Export for Jasper]
|-
| Bugzilla || MySQL import || [http://sourceforge.net/projects/qareports/ Bugzilla analytics] || [http://jasperforge.org/projects/bugzillareportswitholap Bugzilla reports with OLAP]
|-
| MediaWiki || MySQL import || ||
|-
| Transifex || ? || ||
|-
| Mailman || MySQL import via mlstats || ||
|-
| IRC || ? || ||
|-
| Forum || [http://forum.meego.com/stats/ CSV import] || ||
|-
| Web analytics || ? || ||
|-
| git || Via gitdm? || ||
|}
Revision as of 09:42, 15 February 2011
The goal is to provide a web page summarising metrics about various aspects of the MeeGo project. The data should update regularly; depending on the metric, that could mean real time or an automated periodic refresh.
The dashboard will track the following community resources, ideally:
- Drupal members
- Mailing lists (members, posts, threads)
- gitorious (commits, employer details for committers) - should use Jon Corbet's scripts, like those used for the Linux Foundation's yearly kernel development report.
- Wiki (edits, new pages)
- Forums (members, posts)
- IRC (total comments, people on channel)
- Transifex (Languages, translators, strings translated)
- Community OBS (uploads, users)
- SDK downloads (potentially extrapolated from meego.com)
The data should also be available for custom reports, for usage and analysis in the monthly MeeGo Metrics report published by User:DawnFoster.
To fulfill these goals, the dashboard will gather data from the various resources into a centralised database, using a Business Intelligence platform that includes ETL for data acquisition and storage, and a reporting service for generating reports and dashboards. A web page will provide a view into this database with predefined reports.
Candidate reporting solutions:
The following are essentially ETL engines, and do not provide reporting or dashboard functionality:
Mule, from MuleSoft, is an open source ESB, but does not seem adapted to our needs. The field is thus narrowed to Pentaho and JasperReports.
For each community resource, we need to figure out how to get the data into a usable form, and come up with appropriate queries for metrics reports, and finally present the results on a webpage.
Business intelligence engines
The area of Business Intelligence is littered with acronyms. Here's a quick overview of the main ones, and how they all fit together.
- Business Intelligence - general name for any middleware which allows you to query business processes (sales, inventory, etc) and get data overviews from it
- Extract, Transform, and Load - the process of extracting data from a data source (database, screen scraping, text file parsing, whatever), transforming it into a well-understood format, and loading it into your BI engine's database or data warehouse. Good ETL solutions provide a nice way to connect another database and have new data pulled in at regular intervals, to define views into the source data store which you can then query within your BI engine, and so on. Pentaho's ETL tool, Kettle, and JasperETL, used by JasperReports, both provide (kind of) straightforward ways to hook into a MySQL database.
- Enterprise Service Bus - a middleware bus providing a unique interface to applications on the front-end and data stores on the back end. Often used to link up many front-end applications (eg. library, student registration, employee payroll, syllabus management, accounting, supply-chain, student lodgement programmes, etc in a university). Not really useful for us, as far as I can tell.
- Enterprise Application Integration - using software to integrate different applications together. As far as I can tell, this is a meaningless catch-all phrase for anything from kludges to architected business intelligence solutions.
- Data Warehouse. Basically the same thing as a database, as far as I can tell, but bigger and more impressive sounding.
- On-Line Analytical Processing. Commonly used acronym for extracting data via multi-dimensional queries. Databases can be configured to provide the results of this kind of query. As far as I can tell this is mostly a buzzword - an "OLAP database" like Mondrian is basically the same thing as a database. "speed-of-thought" response times indeed.
- Business reporting - an application which allows a graphical view of a database, and allows you to construct queries interactively, often using drag & drop. The results of these queries can then be plugged into graphing software for presentation in a dashboard.
- Dashboard - organised presentation of information in a web page or other similar format, allowing an at-a-glance overview of the situation for the data being measured.
So, in short, the community dashboard project will likely use an ETL to plug data into an OLAP server, and then use a business reporting engine to query that data and present it in a dashboard.
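As a rough illustration of that pipeline, here is a minimal extract-transform-load sketch in Python. SQLite stands in for the warehouse, and the sample rows and field names are invented for the example; in our case Kettle would do this job against MySQL.

```python
import sqlite3

# -- Extract: pretend these rows came from a source system (e.g. a MySQL dump).
raw_rows = [
    {"author": "Alice Dev <alice@example.com>", "date": "2011-02-01"},
    {"author": "Bob Hacker <bob@example.com>",  "date": "2011-02-03"},
]

# -- Transform: split the combined author field into name and email.
def transform(row):
    name, _, email = row["author"].partition(" <")
    return (name, email.rstrip(">"), row["date"])

# -- Load: write the cleaned rows into a warehouse table for later reporting.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE commits (author TEXT, email TEXT, day TEXT)")
db.executemany("INSERT INTO commits VALUES (?, ?, ?)",
               [transform(r) for r in raw_rows])

count = db.execute("SELECT COUNT(*) FROM commits").fetchone()[0]
print(count)  # 2
```

The reporting layer then only ever queries the warehouse table, never the source systems directly.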
Comparison of candidate ETL/reporting
Pentaho is used as the basis of Mozilla's metrics project, and provides a very strong community software option for both the dashboard and for managing the BI server. Since Mozilla metrics work overlaps what we are trying to achieve, particularly their work on SQR, the Software Quality Reports analytic module for Bugzilla and JIRA, Pentaho is my preference for the dashboard project. In general, I have observed that the Pentaho community provides very good support.
Pentaho runs as a webapp in Tomcat 6. It can use a variety of databases for its internal data structures; the default, Hypersonic, is a Java database. However, because MySQL is both standard and well understood, and to allow consolidation of databases under one DB server, I prefer to use MySQL. Configuring Pentaho with a MySQL database is a little tricky, but almost all of the steps are covered well in this tutorial.
The data which is useful for metrics will be copied into a local database from each of the services we query. The copying of data will be accomplished by a set of Kettle transformations, which can be created and edited easily with the Spoon tool.
A number of reports will be generated using the Pentaho Report Designer, including a static HTML/Flash dashboard which will be published regularly. Other reports can be created for the community managers, and a more advanced dashboard, allowing detailed analysis of basic metrics, can be provided via the Community Dashboard Framework.
We will need to see how much load the dashboard will generate on the server. I suspect that it will not be practical to expose the dashboard in public.
For SQL databases, this implies that the server where the dashboard will run should have access to the database server for MediaWiki, Bugzilla, and Drupal.
For the forum, we will integrate the CSV files currently being exported, which provide the basic analytics we need.
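Ingesting a CSV export is straightforward with Python's csv module. A sketch follows; the column layout here is invented, since the actual forum.meego.com export format isn't documented in this page:

```python
import csv
import io

# Invented sample data -- the real forum export columns may differ.
sample = """date,members,posts
2011-01-01,1500,320
2011-02-01,1620,410
"""

rows = list(csv.DictReader(io.StringIO(sample)))
total_posts = sum(int(r["posts"]) for r in rows)
print(total_posts)  # 730
```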
Individual mailing lists will be parsed by MLStats. We will use the resulting database directly in the dashboard.
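A typical dashboard query against the mlstats database would be posts per month. The sketch below uses SQLite as a stand-in (mlstats actually targets MySQL), and the table and column names (`messages`, `first_date`) are assumptions that should be checked against the schema your mlstats version creates:

```python
import sqlite3

# SQLite stand-in for the mlstats database, with an assumed minimal schema.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE messages (message_id TEXT, first_date TEXT)")
db.executemany("INSERT INTO messages VALUES (?, ?)", [
    ("<a@lists>", "2011-01-10 09:00:00"),
    ("<b@lists>", "2011-01-21 14:30:00"),
    ("<c@lists>", "2011-02-02 08:15:00"),
])

# Posts per month -- the kind of query the dashboard would run directly.
per_month = db.execute(
    "SELECT substr(first_date, 1, 7) AS month, COUNT(*) "
    "FROM messages GROUP BY month ORDER BY month"
).fetchall()
print(per_month)  # [('2011-01', 2), ('2011-02', 1)]
```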
Git repositories will be queried with "git log" and parsed with the parser module from gitdm, before being stored directly in a database; we will be able to run analytics on the results from there. gitdm can also do basic analytics on git logs, and we may decide to simply reuse those. However, if we want to extend them, we will want to have the raw data.
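To give a feel for the git side, here is a toy sketch of the kind of per-employer commit counting involved. The log sample is hard-coded; in practice it would come from running something like `git log --pretty=format:"%an <%ae>"` via subprocess. Note that attributing employers by email domain is a crude heuristic used here for illustration only; gitdm itself uses explicit mapping files.

```python
import re
from collections import Counter

# Sample output of: git log --pretty=format:"%an <%ae>"
log = """\
Alice Dev <alice@nokia.com>
Bob Hacker <bob@intel.com>
Carol Coder <carol@nokia.com>
"""

# Crude employer attribution by email domain (illustrative only).
commits_per_domain = Counter(
    re.search(r"<[^@]+@([^>]+)>", line).group(1)
    for line in log.splitlines() if line.strip()
)
print(commits_per_domain["nokia.com"])  # 2
```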
IRC logs will be parsed with superseriousstats, a PHP command line tool that parses IRC logs and stores the results in an SQL database.
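The core counting task looks roughly like the sketch below. The log format shown is an assumed irssi-style layout; real channel logs may differ, and superseriousstats handles several formats natively, so this is only to illustrate the shape of the data:

```python
import re
from collections import Counter

# Assumed irssi-style log lines: "HH:MM <nick> message".
log = """\
10:02 <alice> good morning
10:03 <bob> hi alice
10:05 <alice> anyone seen the build break?
"""

# Count comments per nick, skipping lines that are not chat messages.
comments_per_nick = Counter(
    m.group(1)
    for line in log.splitlines()
    if (m := re.match(r"\d\d:\d\d <([^>]+)>", line))
)
print(comments_per_nick.most_common(1))  # [('alice', 2)]
```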
We still need to figure out how to do data interchange with Transifex and OBS. Dimitris tells me that there are already some analytics available on Transifex, and that there is a RESTful API available to query this data.
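Assuming the Transifex API returns JSON, the import step would mostly be parsing a statistics payload into warehouse rows. The payload shape below is entirely hypothetical and needs to be confirmed against the actual Transifex API documentation:

```python
import json

# Hypothetical response body -- real Transifex API field names may differ.
payload = json.loads("""
[
  {"language": "fi", "translated": 812, "total": 1000},
  {"language": "de", "translated": 950, "total": 1000}
]
""")

# Per-language completion percentage, as a dashboard might display it.
completion = {
    entry["language"]: 100.0 * entry["translated"] / entry["total"]
    for entry in payload
}
print(completion["fi"])  # 81.2
```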
Data to report
For each of the resources, the following statistics (at a minimum) should be extracted:
Not yet in scope
I have not yet considered how I might get web analytics and download stats.