Research Article | Open Access

Engaging Citizens with Open Government Data: The Value of Dashboards Compared to Individual Visualizations

Published: 14 October 2022


Abstract

The use of individual visualizations for Open Government Data (OGD) has been shown to be only partially effective in engaging citizens. Dashboards constitute a promising alternative, but how they should be designed and applied in an OGD context remains under-investigated. This article examines whether well-designed dashboards can increase citizen engagement with OGD. To achieve this objective, a literature review on dashboard design principles is conducted, and its outputs are used to compile a list of 16 dashboard design principles for the OGD context. We then apply these design principles to build the Namur (Belgium) Budget Dashboard (NBDash) as a practical application of our research. Finally, we use NBDash as a use case to evaluate the usefulness of well-designed dashboards compared to individual visualizations through an experimental study. The results of the study with 108 participants suggest that implementing well-designed dashboards can encourage the use of data on portals. In addition, the selection of meaningful metrics and the use of appropriate visualizations, organized in a clear presentation, proved to be the primary factors of successful dashboards.


1 INTRODUCTION

In recent years, large amounts of public data have been published online by governments and local authorities for the purposes of transparency, economic development, and the creation of new services [1–3]. Data have also been published to enable reuse by citizens and thus increase their interest in participating in decision-making about the development of their society [4, 5]. The published data are called Open Government Data (OGD) and are mainly available on dedicated portals [6, 7]. To facilitate the understanding and reuse of the available data, many portals offer individual visualizations in addition to the raw data. Individual visualizations refer to graphical displays (such as statistical charts and graphs, diagrams, and maps) used to analyze and interpret indicator and benchmarking data to reveal the structure, patterns, and trends of variables [8]. For instance, domestic exports by area are presented with a default pie chart on the Singapore Open Data Portal.1 Similarly, during the recent COVID crisis, individual visualizations were a key pillar of government digitization [9, 10]. However, many portals suggest only one individual visualization per dataset, covering only a small part of the information the dataset contains. This decreases the relevance of OGD to citizens, as the added value is not clear to them [5, 11]. As a way forward, dashboards constitute a promising approach to present more complex information visually. Unlike individual visualizations, dashboards collect and display a number of indicators through a common graphic interface [8]. For Few (2006a, p. 34), a “dashboard is a visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance” [12].
Dashboards have long been used in the private sector, through business intelligence (BI) tools (e.g., Google Data Studio,2 Tableau,3 Power BI4) or custom solutions (e.g., VizDeck [13], Keshif [14]). More recently, they have been used in the public sector through city dashboards (e.g., in Dublin,5 London,6 and New York7), which offer an innovative way to present several indicators to citizens using a mix of open and private city data (i.e., data collected from private organizations whose reuse requires a license). In this study, we focus on dashboards built exclusively from datasets available on OGD portals to support predictions and decisions.

However, a major criticism of dashboard systems in the public sector is that they fail to fulfill the conditions and factors needed to improve citizen engagement, as they are subject to many issues [15, 16]. For example, in the case of the London City Dashboard8 and Boston's CityScore,9 the following problems were identified: difficulty in understanding and using the dashboard due to the lack of in-depth details on the tasks being tracked, lack of interpretation of the visualizations, unexplained terminology and abbreviations, lack of information about the quality and veracity of the data or how the metrics are calculated, and no means of collecting citizen feedback. In this study, the term “citizen engagement with OGD” refers to citizens being able to understand, explore, and use OGD independently. A previous study [11] identified 15 constructs (9 conditions and 6 factors) that can stimulate citizen engagement. The nine conditions are (1) the availability of a legal and political framework that grants a mandate to open up government data, (2) sufficient budgetary resources allocated for OGD provision, (3) the availability of OGD feedback mechanisms, (4) citizens’ perceived ease of engagement, (5) motivated citizens, (6) citizens’ sense of urgency, (7) competition among citizen-led OGD engagement initiatives, (8) the diversity of citizens’ skills and capabilities, and (9) the intensive use of social media. The six factors are (1) democratic culture, (2) the availability of supporting institutional arrangements, (3) the technical factors of OGD provision, (4) the availability of citizens’ resources, (5) the influence of social relationships, and (6) citizens’ perceived data quality.
In this study, we focus on the following constructs that are not related to political, financial, or social conditions and factors (e.g., sufficient budgetary resources allocated for OGD provision, motivated citizens, democratic culture, and influence of social relationships) and can therefore be assessed using tools: citizens’ perceived ease of engagement, citizens’ perceived data quality, and diversity of citizens’ skills and capabilities [11].

A solution to these dashboard issues is to apply best practices when deploying dashboards. As pointed out by Matheus et al. (2020), the use of dashboards can create transparency, support accountability, and stimulate engagement if they are properly designed. Therefore, the first necessary step in this study was to collect the main design principles that can facilitate the use of dashboards. Our first research question (RQ1) is thus: “What are the dashboard design principles to ensure their success in the OGD context?” To address RQ1, a Systematic Literature Review (SLR) is conducted to identify the design principles of dashboards, and each design principle obtained from the SLR is then linked to the OGD context. Once this is answered, we seek to understand whether dashboards that incorporate the recommended design principles can improve citizen engagement with OGD over individual visualizations, as this question remains unexplored to our knowledge. Our second research question (RQ2) is therefore: “Can the use of well-designed dashboards (i.e., dashboards that incorporate the identified design principles) help citizens to engage more with OGD than individual visualizations?” To address RQ2, we first apply the design principles to build the Namur (Belgium) Budget Dashboard (NBDash) step by step. We then use NBDash as a use case to assess whether well-designed dashboards help citizens to engage more with OGD (i.e., to understand, explore, and use OGD independently) than the individual visualizations proposed on the Namur (Belgium) open data portal. The experimental study was conducted with 108 participants with different profiles, ranging from the youngest (18–29) to the oldest (50+), and with education levels from high school to PhD. The contribution of this article is threefold. First, it provides dashboard designers with a list of best practices for deploying dashboards in the OGD context.
Second, it applies the design principles to build a usable tool that can be used as a basis to develop other dashboards. Third, it suggests that citizens prefer to use well-designed dashboards over individual visualizations to engage with (i.e., use or explore) OGD.

The remainder of this article is organized as follows. Section 2 presents the systematic literature review approach and its outputs. Section 3 presents the methodology of NBDash. Section 4 describes the proposed dashboard NBDash and its impact on citizen engagement. Section 5 presents the discussion and limitations of this study and then proposes some avenues for future work. Finally, Section 6 provides a conclusion that summarizes the contributions of this article.


2 SYSTEMATIC LITERATURE REVIEW

In this section, we first describe the systematic literature review approach used to identify the design principles of dashboards. Then, we present the outputs of the SLR.

2.1 Systematic Literature Review Approach

We carried out an SLR based on the established procedure [18] to access existing knowledge on the design principles of general, city, and OGD dashboards. We do not limit the SLR to the OGD context only, as we believe that the design principles of general and city dashboards can be applied to OGD dashboards, since they also sometimes use open data. In order to cover as many relevant publications as possible, we searched the following databases: Scopus,10 Science Direct,11 and Association for Computing Machinery (ACM).12 With the aim of automating the search in the selected databases, the following search string was constructed using the combination of keywords from our research question RQ1: (“*dashboard” OR “*visual*”) AND (“design”) AND (“principle” OR “practice” OR “guideline”) AND (“open government data” OR “open data” OR “government data” OR “public government data” OR “public data” OR “public sector information”). The search string was later customized based on the requirements of each database. The term “visual*,” which can cover, for example, “visualization” or “visual,” was added to the search terms because a dashboard is a collection of visualizations, and, therefore, visualization best practices can also be applied to dashboards. Based on the automated search, we obtained 274 articles. We then identified relevant articles in three stages: first, we evaluated the type, domain, and title; second, we examined the abstract; and finally, we scanned the content (see Figure 1).

Fig. 1.

Fig. 1. Filter processes applied in the systematic literature review (SLR), with the number of remaining papers at each step.

In the first stage, 237 publications meeting one of the following criteria were excluded from the review: duplicate papers, studies whose titles were not relevant to the keywords from RQ1, studies published in the health sector or mathematics, and studies not written in English. In the second stage, we excluded 18 publications that we deemed irrelevant to dashboard or visualization design principles because neither term was mentioned in their abstracts. In the third stage, we excluded six irrelevant publications because they did not provide any design principles and instead focused on describing a proposed platform or on a specific domain (e.g., health, survey, or learning analytics dashboards). We also applied forward and backward search [19] by examining the references and citations of the selected articles, adding four more relevant publications. In the end, we retained 17 articles that empirically explored visualization or dashboard design principles. In addition to these 17 articles, we identified 7 articles from the gray literature using the Google search engine. The retained articles are listed in Appendix A.1.
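The stage-by-stage filtering above can be sanity-checked with a short script; the stage labels are ours, and the counts are taken directly from the text:

```python
# Paper counts at each SLR filtering stage, as reported in the text.
initial = 274
excluded_per_stage = [
    ("stage 1: type, domain, and title", 237),
    ("stage 2: abstract", 18),
    ("stage 3: content", 6),
]

remaining = initial
for label, n_excluded in excluded_per_stage:
    remaining -= n_excluded
    print(f"after {label}: {remaining} papers remain")

# Forward/backward (snowballing) search added four publications [19].
retained = remaining + 4
gray_literature = 7
print(f"retained from databases: {retained}")           # 17 articles
print(f"total reviewed: {retained + gray_literature}")  # 24 articles
```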

2.2 Systematic Literature Review Outputs: Dashboard Design Principles (RQ1)

Based on the review of the selected papers, we were able to identify dashboard design principles. In this section, we first briefly explain the different stages of the dashboard data cycle and then present dashboard design principles collected from the literature review and applied in the OGD context.

Dashboard data cycle. Figure 2 illustrates the different steps considered when deploying dashboards [17, 20]. The process comprises seven steps: metrics choice (defining the metrics to include in the dashboard), data collection (collecting datasets), data processing (cleaning and transforming datasets), data analysis (analyzing datasets to get clear details about their content), building the dashboard (creating adequate visualizations for each defined metric, building the dashboard layout, and integrating the visualizations into the layout), deployment (sharing the created dashboard with citizens), and feedback/customization. After deployment, citizens can use the dashboard and provide feedback, and the dashboard can be customized by adding more datasets or changing metrics.

Fig. 2.

Fig. 2. Dashboard data cycle with associated design principles.

Table 1 provides a summary of the dashboard design principles from the literature review with their application in the OGD context.

Table 1.
Design Principle | Application in the OGD Context

P1. Pick meaningful metrics [15, 21–25]: Using meaningful and understandable indicators that are relevant to citizens helps them understand the utility of government dashboards, creates added value, and increases usability.

P2. Collect accurate and precise data [17, 23, 26, 27]: Data quality and a lack of understanding of published data are among the reasons that open data is not used [28]. Government dashboards should therefore avoid ambiguous or unreliable data and metadata in order to make the data easier to understand and to reassure users of its veracity.

P3. Ensure your data makes sense [15, 17]: Much open data is made available online and integrated into dashboards after aggregating the initial data. It is therefore very important to ensure that the data is consistent before and after the transformation process in order to guarantee its quality and veracity.

P4. Consider audience [22, 24–26, 29, 30]: As previous studies show [29, 31–33], users from different backgrounds are interested in using open data. Since not all users have the same technical skills to understand data and visualizations, it is important to consider the target audience before implementing a government dashboard or, if possible, to propose different dashboards according to the type of user, as was done in the Dublin dashboard [29].

P5. Use best visualization practices [16, 17, 21–27]: According to survey results on visualizations for OGD [34], citizens like simple and attractive visualizations that help them understand the data without much effort. For example, simply arranging the data in descending order in a bar chart can help users quickly grasp the highest and lowest values presented. More details on visualization design principles can be found in [32, 34–36].

P6. Use the right type of chart [16, 17, 21, 22, 24–27]: According to previous studies [34, 37–40], visualizations are very useful for understanding the information in the data, but they should be chosen carefully according to the types of data and the target audience in order to be easily understandable. For example, using a treemap to show the population distribution of a city can be very useful for journalists but difficult to understand for ordinary citizens who are not familiar with visualization. In a government dashboard, these parameters should be taken into account before choosing the visualization for a specific metric.

P7. Provide easy-to-use tools [15, 17, 26]: Only a few cities and open data portals provide dashboards for their data, because they lack the technical skills, time, and financial resources to build and update them [11]. One solution would be to provide free and easy-to-use tools to help them create dashboards, thereby facilitating citizens' understanding of published data and improving its reuse.

P8. Clear presentation [15–17, 21–23, 25, 26, 30]:

The layout and presentation of the dashboard are very important for its success and should be adapted to the device screen size (e.g., phone, tablet, PC) [30].

When presenting government dashboards, it is recommended to display key information at the top of the screen (using, for example, single-value charts or a single value with an indicator), followed by more details (using advanced visualizations, e.g., line charts and bar charts) [17].

It is also recommended to group together items within the same domain when the dashboard covers different topics, as is often the case in government dashboards. Common mistakes that lead to a messy and unclear presentation are discussed in [16, 25, 30].

In addition, [41–43] present UX and UI design tips such as avoiding unnecessary elements; using clear language on labels and in messaging; considering the spatial relationships between items on the page and structuring the page by importance; and using different sizes, fonts, and arrangements of text to increase scannability, legibility, and readability.

P9. Provide context and data interpretation support [21, 25, 26, 29]: Not all citizens can easily understand visualizations, so it is important to provide additional contextual information or metadata to clarify the meaning of the data and to avoid misinterpretation of each visualization presented on government dashboards. In addition, each visualization should have a title and titled axes.

P10. Think about data literacy levels [24, 26, 29]: As mentioned earlier, one of the reasons for the lack of use of open data is the lack of clarity in the metadata provided on open data portals. Government dashboards therefore need to use clear and consistent terminology and familiar words, phrases, and concepts to explain their purpose, allowing citizens from different backgrounds to understand them.

P11. Ensure data is up to date [15, 17, 26]: Government dashboards are mainly used to verify information and then make decisions. It is therefore very important to let citizens know whether the data is current, or when it was extracted, to help them make correct decisions.

P12. Allow access to data source [15, 17, 24]: Government dashboards sometimes use data collected from external organizations that is not otherwise accessible, and sometimes provide no link to the data used. Allowing access to data sources enables customization and increases user confidence in the dashboard.

P13. Check for personal data/outliers [15, 17, 24]: Since much government data is obtained by collecting citizen data, governments need to ensure its confidentiality when using it in dashboards, for example by aggregating the data and avoiding any risk of it becoming identifiable. This helps address the data privacy issues observed in dashboards.

P14. Interaction support [17, 21, 24, 25]: Rather than displaying static visualizations, which may suit novice users but not less advanced and advanced users such as journalists or developers who request more interaction, government dashboards should offer interaction features: hovering over an item in a visualization to get more details, using filters to update the data, and drilling down into specific trends, metrics, or insights with ease. Adding interaction thus supports different types of users.

P15. Ensure feedback support [17, 25, 30]: One factor influencing citizen engagement with OGD is the availability of OGD feedback mechanisms. Government dashboards should therefore never stop evolving and should let users provide feedback, through user testing during development or after deployment, that can later be used to improve the layout, functionality, look, and feel of the dashboard to ensure optimal value at all times. In addition, a feedback feature that helps citizens report suspected fraud or corruption to an independent and trusted agency can encourage citizens to use dashboards.

P16. Customization [24, 26, 29]: Reuse by citizens is one of the goals of open data initiatives. Government dashboards should provide citizens with all the information needed to customize existing dashboards, facilitating reproducibility, which can improve trust and allow citizens to create improved versions of dashboards without contacting the dashboard manager.

Table 1. Dashboard Design Principles in the OGD Context
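Principles P5 and P6 lend themselves to a simple rule of thumb mapping data characteristics to chart types; the sketch below mirrors the choices later made for NBDash (a sorted bar chart for categorical data, a pie chart for proportions, a line chart for temporal data). The function name and the rules themselves are our illustration, not a prescribed API:

```python
def suggest_chart(data_kind: str, audience: str = "novice") -> str:
    """Suggest a chart type per P5/P6 (illustrative rules, not a standard).

    data_kind: "categorical", "proportion", or "temporal"
    audience:  "novice", "less_advanced", or "advanced"
    """
    basic = {
        # P5: descending order lets users grasp extremes at a glance.
        "categorical": "bar chart (sorted descending, single color)",
        "proportion": "pie chart",
        "temporal": "line chart",
    }
    if data_kind not in basic:
        raise ValueError(f"unknown data kind: {data_kind!r}")
    # P4/P6: denser encodings such as treemaps suit advanced users only.
    if data_kind == "categorical" and audience == "advanced":
        return "treemap or " + basic["categorical"]
    return basic[data_kind]

print(suggest_chart("temporal"))  # line chart
```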


3 NBDash OVERVIEW AND EVALUATION METHOD

After gathering the list of dashboard design principles, we applied them to build NBDash, in order to have an example of a well-designed dashboard for participants to use during the evaluation. NBDash is a web application built using two budget datasets (Namur-Ordinary Budget by function13 and Namur-Extraordinary Budget by function14) available on the Namur open data portal.15 This is the official open data portal of the city of Namur (Belgium), created in 2018 with the objectives of informing citizens, publishing statistics, enabling the creation of useful applications, and increasing transparency. As of August 2021, the portal contained 174 datasets across 13 topics: urbanism, population, administration, transport, culture, environment, health, sport, energy, economy, education, internal data, and closed data (data that can be accessed only by a specific group of users). These datasets are either statistics, which can be consulted online, or batches of data, which can be used directly for the creation of applications. We selected the budget datasets for two reasons. First, transparency-related datasets in general (and budget datasets in particular) are very interesting to citizens [34, 44, 45]. Second, a dashboard built on these datasets lets citizens see how the municipality's budget is allocated, which can increase transparency. In addition, we chose the Namur portal because no budget dashboard was available for this municipality and we had access to the portal's key stakeholders. More details on the development of NBDash are presented in Section 4.1.

An evaluation was later conducted to determine whether citizens prefer well-designed dashboards to the individual visualizations offered on a traditional OGD portal, which do not fully incorporate some of the identified design principles (such as feedback support, data literacy levels, and data interpretation). We adopted an experimental design based on the static-group comparison model and divided the participants into two groups [46] (p. 12) (the profile of participants in each group is presented in Section 4.2, and further details on participant recruitment are provided below). The first group (control group) evaluated only the individual visualizations presented on the Namur portal for both budget datasets (see Figure 3). The second group (treatment group) evaluated only the NBDash dashboard. Participants' feedback was collected through a survey consisting of 16 questions on a 5-point Likert scale (from “Totally Disagree” to “Totally Agree”), each accompanied by a free-text field to justify the rating, plus three additional questions to collect demographic data (age, gender, education). At the beginning of the questionnaire, where the context of the survey was presented, we mentioned that the two datasets were used as illustrations and that the evaluation should be based not only on these specific datasets but on dashboards and individual visualizations in general.

Fig. 3.

Fig. 3. Individual visualizations proposed on the Namur portal for the ordinary and extraordinary budget datasets (translated to English using Google Translate). (Left) Average ordinary service revenue by function over time. (Right) Total extraordinary expense by function over time.

The first 15 questions were carefully constructed to correspond to the following constructs (conditions and factors) proven to impact citizen engagement in [11] and to verify the implementation of the defined design principles: (C1) citizens’ perceived ease of use (corresponding to effort expectancy, or citizens’ perceived ease of engagement, in [11]); (C2) diversity of citizens’ skills and capabilities; and (C3) citizens’ perceived data veracity and quality (e.g., accuracy, completeness, timeliness [47, 48]; see the example in Section 3 of Figure 4(a)). The last question, Q16, was added to gather a general opinion on RQ2, i.e., whether the use of dashboards or visualizations encourages citizens to engage more with OGD. We selected these 3 constructs among the 15 identified in [11] because they are independent of political, financial, or social conditions and factors (e.g., citizen motivation and citizens’/government resources) and can therefore be evaluated using tools. The list of survey questions is presented in Appendix A.2. Table 2 summarizes the corresponding questions for each construct, explains why we chose them, and provides references where appropriate. Two surveys (one for the visualizations and one for the dashboard) were distributed to collect feedback from participants using dragnsurvey16 and were pretested with two citizens to reduce the errors commonly associated with survey research [49]. Next, we shared the surveys on social networks such as Facebook groups and Twitter to recruit participants, and later used Amazon Mechanical Turk17 to recruit additional participants. We did not set any conditions that participants had to meet in order to complete the survey, except that they must properly justify their choices, because we wanted a diversity of profiles among the participants.
Amazon Mechanical Turk was used because, after one month of posting the surveys on social networks, only three participants had responded; the tool has also proven effective and reliable in previous studies [50, 51]. Each participant received $1.50 in compensation for completing the survey, which takes about 15 minutes.

Fig. 4.

Fig. 4. NBDash interface for different display types.


Table 2.
Construct | Questions | Comments

(C1) Citizens’ perceived ease of use
Questions: the 10 questions of the SUS questionnaire (Q1 to Q10).
Comments: SUS was used as it is suitable for measuring the usability of a system on its own and for comparing the usability of multiple systems [52, 53].

(C2) Diversity of citizens’ skills and capabilities
Questions:
Q11. I can easily tell what we can learn from the datasets based on [these visualizations/this dashboard] (refer to P8, P9)
Q12. I can easily draw conclusions based on [these visualizations/this dashboard] (refer to P8, P9)
Q13. I can easily understand [these visualizations/this dashboard] (refer to P10)
Q14. I can easily modify or customize [these visualizations/this dashboard] to see other aspects of the datasets (refer to P16)
Comments: These questions were constructed with reference to the design principles that need to be covered for a system to take into account the skills of end users (P4. Consider audience). We based these questions mainly on the following design principles: P8. Clear presentation, P9. Provide context and data interpretation support, P10. Think about data literacy levels, and P16. Customization.

(C3) Citizens’ perceived data veracity and quality (citizens’ confidence in the veracity of data and ease of access to data quality)
Questions:
Q15. [These visualizations/this dashboard] provide(s) me necessary information to verify that the data used are accessible, accurate, and up to date and to easily access the quality of the datasets (refer to P2, P12)
Comments: This question was constructed with reference to the design principles regarding data quality and veracity (P2. Collect accurate and precise data, P12. Allow access to data source).

Table 2. Correspondence between Citizen Engagement Constructs and Survey Questions
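Construct C1 is measured with the standard 10-item SUS questionnaire, whose 1-5 responses convert to a 0-100 score: odd-numbered (positively worded) items contribute their score minus 1, even-numbered (negatively worded) items contribute 5 minus their score, and the sum is multiplied by 2.5. A generic implementation of that formula (this is not the authors' analysis code):

```python
def sus_score(responses):
    """Convert ten 1-5 Likert responses into a SUS score on a 0-100 scale."""
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires exactly ten responses between 1 and 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# A fully neutral respondent (all 3s) lands in the middle of the scale.
print(sus_score([3] * 10))  # 50.0
```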


4 RESULTS: THE IMPACT OF NBDash ON CITIZEN ENGAGEMENT (RQ2)

In this section, we first describe the development of NBDash. Then we present the results of the evaluation.

4.1 NBDash: System Description

NBDash is a web application (source code available at https://github.com/chokkipaterne/nbdash) built using three technologies: Python as the programming language, Pandas18 as the data processing library, and Dash Plotly19 to create and display the visualizations on the web page. The deployment of NBDash was done after following each step of the deployment process shown in Figure 2 and applying the associated design principles for each step.

Metrics Choice and Data Collection. In this step, we defined metrics with reference to the existing metrics in the London budget dashboard20 and validated them by gathering feedback from two citizens (P1). The metrics were also categorized based on users’ skills in order to display only necessary and understandable metrics to each user (see Appendix A.3) (P4). We then collected the latest update of the budget datasets related to the selected metrics from the official Namur open data portal (P2 and P11). We also provided users with access links to the datasets for possible reuse (P12) and calculated the data quality of each dataset based on basic features: missing values, data information, and metadata information (column titles and descriptions) [54], to indicate to users the quality of the data (see Section 3 of Figure 4(a)) (P2).
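The per-dataset quality indicator described above can be sketched as a weighted combination of completeness and metadata availability; the weights and the exact formula here are our assumption for illustration, not the scoring used in [54]:

```python
def quality_score(missing_ratio, has_column_titles, has_descriptions):
    """Illustrative dataset quality score in [0, 1].

    Combines completeness (share of non-missing cells) with metadata
    availability (column titles and descriptions). The 0.7/0.3 weights
    are assumptions for this sketch, not values from the literature.
    """
    if not 0.0 <= missing_ratio <= 1.0:
        raise ValueError("missing_ratio must be between 0 and 1")
    completeness = 1.0 - missing_ratio
    metadata = (int(has_column_titles) + int(has_descriptions)) / 2
    return round(0.7 * completeness + 0.3 * metadata, 2)

# A complete dataset with full metadata gets the maximum score.
print(quality_score(0.0, True, True))  # 1.0
```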

Data Processing and Data Analysis. In this step, we used Excel formulas in parallel to Pandas to aggregate and filter data according to each metric in order to ensure that we get the same results for both options (P3). Data aggregation was also used to ensure that personal data is not disclosed (P13).
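The cross-check between Excel and Pandas described above can be reproduced in miniature: aggregate toy budget rows by function with Pandas and compare the result against an independently computed total (a plain-Python sum stands in for the Excel formula; the values are made up, not the actual Namur data):

```python
import pandas as pd

# Toy budget rows (illustrative values only).
rows = [
    {"function": "Culture", "year": 2020, "amount": 120.0},
    {"function": "Culture", "year": 2021, "amount": 130.0},
    {"function": "Sport", "year": 2020, "amount": 80.0},
]
df = pd.DataFrame(rows)

# Pandas aggregation, as used for the dashboard metrics.
by_function = df.groupby("function")["amount"].sum().to_dict()

# Independent re-computation (stand-in for the Excel cross-check, P3).
manual = {}
for row in rows:
    manual[row["function"]] = manual.get(row["function"], 0.0) + row["amount"]

assert by_function == manual  # both computation paths must agree
print(by_function)  # {'Culture': 250.0, 'Sport': 80.0}
```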

Build Dashboard and Deployment. In this step, we implemented three display types to accommodate the skills of the audience, which vary from novice users (low knowledge of visualizations) to advanced users (high knowledge of visualizations) [29]. The display types are as follows: a “simple” display for users who are novices with visualizations (see Figure 4(c)), a “less advanced” display for users who need more control over the data and visualizations displayed on the dashboard (see Figure 4(b)), and an “advanced” display for users who want to customize the dashboard (see Figure 4(a)) (P4). For each display type, we placed the metrics that represent the big picture of the data at the top of the dashboard, followed by the metrics that provide more details (see Sections 7 and 8 of Figure 4(a)) (P8). Three main visualization types were used to represent the selected metrics: a bar chart with descending sorting and a single color for categorical data (see Section 8 of Figure 4(a)), a pie chart for proportions, and a line chart for temporal data (P5 and P6). To help users understand the graphs and avoid misinterpretation, we provided a chart title, axis titles, and a short interpretation for each graph (see Figure 4(c)) (P9). Since the datasets contained many financial terms, we created a terminology section to explain key budget terms (see Section 2 of Figure 4(a)) and used easy-to-understand words for the titles, axes, and interpretations of graphs and for the dashboard layout (P10). Finally, we added filters (e.g., fiscal year, function, budget type) that allow users to update the displayed data and change the visualization type (see Sections 6 and 9 of Figure 4(a)) in order to support interaction (P14).
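The filter behavior described above (fiscal year, function, budget type) amounts to narrowing the rows before re-rendering a chart; a minimal sketch with hypothetical field names:

```python
def apply_filters(rows, fiscal_year=None, function=None, budget_type=None):
    """Return the rows matching the selected dashboard filters (P14).

    A filter left as None is ignored. Field names are illustrative,
    not taken from the NBDash source.
    """
    def keep(row):
        return ((fiscal_year is None or row["year"] == fiscal_year)
                and (function is None or row["function"] == function)
                and (budget_type is None or row["type"] == budget_type))
    return [row for row in rows if keep(row)]

budget_rows = [
    {"year": 2020, "function": "Culture", "type": "ordinary", "amount": 120.0},
    {"year": 2021, "function": "Culture", "type": "ordinary", "amount": 130.0},
    {"year": 2020, "function": "Sport", "type": "extraordinary", "amount": 80.0},
]
print(len(apply_filters(budget_rows, fiscal_year=2020)))  # 2
```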

Feedback/Customization. In this step, we provided a feedback form to users and allowed them to track the status of their feedback (see Section 4 of Figure 4(a)) (P15). We also added an "Edit chart" button that lets users edit the displayed graph directly in Plotly's chart studio21 and provided access to the source code, which can be used to enhance NBDash or create a new dashboard (see Sections 5 and 8 of Figure 4(a)) (P16).

Figure 4 shows the interface of NBDash for the three different display types. The layout is nearly the same for all display types, with the exception of Sections 5, 8, and 9, which are slightly modified depending on the display type. Table 3 details the difference between these three display types.

Table 3.

Criteria | Novice | Less Advanced | Advanced
User type | Users with a low level of visualization knowledge, such as the general public. | Users with a middle level of visualization knowledge, such as public servants who often use visualizations in their work. | Users with a high level of visualization knowledge, such as developers.
Section 5 | The "access source code" button is not visible. | The "access source code" button is not visible. | The "access source code" button is visible.
Section 8 | Only a bar chart is shown. The "edit chart" menu is not visible. | The selected chart (bar, line, or pie) is shown, followed by the data in table format. The "edit chart" menu is not visible. | The selected chart (bar, line, or pie) is shown, followed by the data in table format. The "edit chart" menu is visible.
Section 9 | The advanced filters are hidden, and only a small interpretation of the chart is displayed. | The advanced filters (e.g., fiscal year, function, budget type) are shown to allow users to update the data displayed and change the visualization type. The chart interpretation is hidden. | Same as Less Advanced.
Other sections (1–4, 6, 7) | These sections remain the same across display types, except for the text about the display type in Section 1, which is updated according to the chosen display type.

Table 3. Difference between the Different Display Types in NBDash

4.2 Insights from the Experimental Study

Through questionnaires that participants completed after exploring the visualizations on the Namur Portal and NBDash, we gathered their opinions on RQ2. A total of 50 participants completed the survey on visualizations and 58 completed the survey on NBDash. A minimum of 50 participants was recruited for each group because previous studies [55–58] indicate that 5 to 50 participants constitute a good baseline for comparison or usability tests. Table 4 presents the demographics of the participants for both surveys. We did not observe a significant relationship between this demographic information and participants' choices, so we do not discuss it further in this article.

Table 4.

          |                  | Group 1 (Visualizations), 50 participants | Group 2 (NBDash), 58 participants
Sex       | Male             | 31 (62%)  | 39 (67.24%)
          | Female           | 19 (38%)  | 19 (32.76%)
          | Other            | 0 (0%)    | 0 (0%)
Age       | 18–29            | 21 (42%)  | 22 (37.93%)
          | 30–49            | 25 (50%)  | 32 (55.17%)
          | 50+              | 4 (8%)    | 4 (6.9%)
Education | None             | 0 (0%)    | 0 (0%)
          | Primary          | 0 (0%)    | 0 (0%)
          | High School      | 3 (6%)    | 3 (5.17%)
          | Higher Education | 39 (78%)  | 50 (86.2%)
          | PhD              | 8 (16%)   | 5 (8.63%)

Table 4. Demographic Data of Surveys

As the responses were provided on an ordinal scale, we used the average (avg) and standard deviation (σ) to evaluate the collected answers. We also used a one-way ANOVA test (in particular its p-values) to verify the statistical significance of the differences between the two groups. Table 5 presents the corresponding p-values, averages, and standard deviations of the responses for each group; the visualization group is referred to as group 1 and the dashboard group as group 2. The results in Table 5 show that NBDash offers greater usability than individual visualizations: its average SUS score (76.85) exceeds 68, the minimum required [52], and is also greater than the average SUS score of the individual visualizations. The results also show substantial spread in the individual SUS scores within both groups (σ ≥ 11). From this first part of the results, we can deduce two things: (1) citizens perceived the well-designed dashboard as easy to use, and (2) citizens perceived it as easier to use than individual visualizations, whose usability score is below the threshold. Regarding statements Q11, Q12, and Q13, the averages of the scores of group 2 participants (resp. 4.21, 4.31, 4.5) are higher than those of group 1 participants (resp. 3.98, 4.08, 4.14), with little spread in the scores (σ ≤ 1 for Q11 to Q13). For statement Q14, participants in group 1 found it slightly easier to modify or customize the individual visualizations (avg = 4) than participants in group 2 found NBDash (avg = 3.97). However, this difference (0.03) is much smaller than the previous ones, and again the spread is small (σ ≤ 1 for Q14).
Based on the scores of these four statements (Q11 to Q14), we can infer that participants agree that well-designed dashboards accommodate the diversity of skills and capabilities better than individual visualizations. Indeed, participants with different levels of education in group 2 were more likely to easily understand the data, draw conclusions, and modify the display (with a smaller difference between the two groups in this last aspect) than those in group 1. Regarding the statement about the evaluation of data quality (Q15), more participants in group 2 (avg = 4.23) than in group 1 (avg = 4.02) agreed that they had the information necessary to evaluate the quality of the datasets. We can therefore deduce that citizens perceived data veracity and quality more easily when using a well-designed dashboard than with individual visualizations. The results for statement Q16 show that more participants in group 2 (avg = 4.43, σ ≤ 1) than in group 1 (avg = 4.02, σ ≤ 1) agreed that they would be interested in exploring, understanding, and using (engaging with) more data on a portal if the data were presented with this type of dashboard rather than with individual visualizations. We can therefore deduce that citizens would be more interested in engaging with OGD if the data were represented using a well-designed dashboard rather than individual visualizations. In addition, the p-values show that the differences between NBDash and the individual visualizations were statistically significant (p-value ≤ 0.05) for questions Q11, Q13, and Q16. Based on all these conclusions, we can answer RQ2 by saying that the use of well-designed dashboards can help citizens engage more with OGD than individual visualizations.
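For reference, the SUS scores discussed above follow Brooke's standard scoring rule [52]: each odd-numbered item contributes (score − 1), each even-numbered item contributes (5 − score), and the sum is multiplied by 2.5, yielding a 0–100 scale. A minimal sketch:

```python
def sus_score(responses: list[int]) -> float:
    """Compute the System Usability Scale score (Brooke, 1986) from the
    ten 1-5 Likert responses to items Q1-Q10."""
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 responses")
    total = 0
    for i, r in enumerate(responses):
        # Odd-numbered items (Q1, Q3, ...) are positively worded;
        # even-numbered items (Q2, Q4, ...) are negatively worded.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# A respondent answering 4 to every positive item and 2 to every
# negative item contributes 3 points per item: 30 * 2.5 = 75,
# above the 68 usability threshold.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

A group's SUS score, as reported in Table 5, is then simply the mean of the individual scores.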

Table 5.

                      | Visualizations on Namur Portal | NBDash        | p-Value
                      | Avg   | σ     | Avg   | σ     |
SUS Score (Q1 to Q10) | 66.3  | 12.81 | 76.85 | 11.18 | -
Q11                   | 3.98  | 0.71  | 4.21  | 0.59  | 0.04
Q12                   | 4.08  | 0.9   | 4.31  | 0.79  | 0.16
Q13                   | 4.14  | 0.81  | 4.5   | 0.54  | 0.006
Q14                   | 4     | 0.86  | 3.97  | 0.88  | 0.83
Q15                   | 4.02  | 0.77  | 4.23  | 0.83  | 0.13
Q16                   | 4.02  | 0.89  | 4.43  | 0.57  | 0.004

Table 5. Average (avg), Standard Deviation (σ), and p-Value of Survey Scores
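The group comparison reported in Table 5 can be reproduced in outline with SciPy's one-way ANOVA. The per-participant scores below are fabricated for illustration only, NOT the study's actual data; note that with exactly two groups, a one-way ANOVA is equivalent to a two-sample t-test (F = t²).

```python
import statistics
from scipy.stats import f_oneway

# Illustrative 1-5 Likert responses for one survey item; made-up
# numbers, NOT the study's actual data.
group1 = [4, 4, 3, 5, 4, 4, 3, 4, 5, 4]   # visualizations (group 1)
group2 = [5, 4, 5, 4, 5, 4, 5, 5, 4, 5]   # NBDash (group 2)

avg1, sd1 = statistics.mean(group1), statistics.stdev(group1)
avg2, sd2 = statistics.mean(group2), statistics.stdev(group2)

# One-way ANOVA across the two groups; p <= 0.05 is the significance
# threshold used in the article.
f_stat, p_value = f_oneway(group1, group2)
print(f"group1: {avg1:.2f} ({sd1:.2f}), group2: {avg2:.2f} ({sd2:.2f}), p={p_value:.3f}")
```

The same call, applied per item (Q11 to Q16) to the two groups' raw scores, yields the p-value column of Table 5.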

These observations can be explained as follows. First, according to participants' comments, the visualizations used in NBDash are easier to understand than those on the Namur portal. On the Namur portal, some participants found that the visualizations contained too much data and that the visualization technique used to represent the data was not easy to understand. Second, many participants found NBDash user-friendly and well organized and therefore easy to use and understand. However, two participants in group 2 disagreed: they felt the dashboard contained too much information and too much text and suggested, for example, hiding the sections on terminology, feedback, and data used, displaying them only when the user requests them. Third, participants found that they could modify the visualizations more easily on the Namur portal than in NBDash. Five participants in group 2 could not figure out how to modify or customize the visualization in NBDash, probably because they were using the "Simple" display while the option to modify the chart is only available in the "Advanced" display. Fourth, participants perceived that data quality information was more accessible in NBDash than with individual visualizations, because in NBDash we clearly specified the data used, along with their sources and last update time, and also evaluated the data quality so that users could form an idea of it without accessing the data themselves.

Another finding from the participants' comments is that the design principles most important to them are the selection of meaningful metrics (P1), the use of appropriate visualizations (P5 and P6), and a clear presentation and layout (P8). This is supported by the fact that many participants justified their ratings on the basis of these four design principles. The other design principles also mattered, as without them we would not have achieved a sufficient SUS score. However, the principle of providing context and assistance in interpreting the data (P9) was criticized by some participants, as applying it added more text to the dashboard. A compromise must therefore be found between providing enough detail and not cluttering the dashboard.


5 DISCUSSION

This research contributes to theory and practice in the following ways. First, it extends recent work [11] by using its recommended conditions for the emergence of OGD citizen engagement to propose design principles that can be incorporated into dashboards to fulfill these conditions. Second, unlike previous studies [15–17, 21–23, 25, 26, 29, 30] that focused on design principles for general or city dashboards, it contributes to the knowledge base by proposing 16 design principles with a clear application in the OGD context. The design principles provided can therefore be used by dashboard designers and OGD managers to implement usable and understandable dashboards that can improve citizen engagement with OGD. Third, in contrast to previous studies [15–17, 21–23, 26, 29], we show through a concrete case study how to apply each design principle; this case study can serve as a source of inspiration for dashboard designers and OGD managers who want to create their own dashboards using OGD. Fourth, we provide access to the source code of the case study, which can be used as a starting point to create new dashboards or to improve the Namur budget dashboard. Fifth, the usefulness of dashboards in helping citizens engage with OGD on portals was demonstrated by the evaluation results. We suggest that OGD managers provide more dashboards on their portals and follow the design principles to make them easy to use and understand.

However, this research has some limitations that will need to be addressed in future work. The first limitation concerns the representativeness of the participants in the evaluation. The use of Amazon Mechanical Turk can introduce bias [59, 60]. We tried to minimize it by following best practices [60, 61], such as using strict criteria to select relevant participants and checking the consistency of participants' comments with their ratings before validating their submissions, because we noticed that some participants were not fluent in English and that their ratings and comments sometimes did not match. A stronger remedy would be to use other channels of communication or to collect data on-site in administrations, universities, or public places; in this study, this was not feasible due to the COVID-19 situation. The second limitation resides in the use of only three of the factors mentioned in [11, 15] that impact citizen engagement to define our design principles. The excluded factors concern, for example, citizen motivation and citizens'/government resources, which we believe could be addressed through communication and financial resources. Other researchers can build on our study and investigate whether design principles can address these remaining factors. The third limitation is that we did not implement a generic tool for OGD dashboards. Researchers and programmers can build on the implemented dashboard and the proposed design principles to implement a generic, usable tool that helps OGD managers easily create dashboards that follow these best design principles.
Further research may also involve collecting additional data following the Unified Theory of Acceptance and Use of Technology (UTAUT) [62, 63] model to check whether demographic and social factors actually impact citizen engagement, as we were unable to cover this aspect in this study due to the sample size and distribution. The UTAUT model is suggested because it includes four main constructs, namely performance expectancy, effort expectancy, social influence, and facilitating conditions, while accommodating four moderators: age, gender, voluntariness, and experience. Therefore, it is the most appropriate model for assessing the impact of demographic and social factors compared to other models (e.g., TAM, TOE) that do not incorporate these factors.


6 CONCLUSION

The use of individual visualizations on open data portals has been shown to lack efficiency in reducing the information asymmetry between governments and citizens [5, 11]. Dashboards may be a promising way to address this problem: individual visualizations cover only a small part of the information contained in a dataset, whereas dashboards often incorporate more details in a single screen. The objective of this article is to identify the design principles of dashboards in the OGD context that facilitate their use (RQ1) and to investigate whether the use of well-designed dashboards can help citizens engage with OGD (RQ2). To address RQ1, a systematic literature review was conducted, which allowed us to derive 16 design principles applicable to OGD dashboards. They are as follows: (P1) pick meaningful metrics, (P2) collect accurate and precise data, (P3) ensure data quality, (P4) consider the audience, (P5) use best visualization practices, (P6) use the right type of chart, (P7) provide easy-to-use tools, (P8) provide clear presentation, (P9) provide context and data interpretation support, (P10) think about data literacy levels, (P11) ensure data is up to date, (P12) allow access to the data source, (P13) ensure data privacy, (P14) provide interaction support, (P15) integrate feedback support, and (P16) allow customization. To address RQ2, we developed the Namur Budget Dashboard (NBDash), which implements these design principles, and compared it to the budget visualizations on the Namur portal in terms of ease of use, diversity of citizens' skills and capabilities, data veracity and quality, and citizens' intention to engage.

An experimental study with two groups of 50 participants (for the evaluation of the budget visualizations on the Namur portal) and 58 participants (for the evaluation of NBDash) was conducted to address RQ2 and revealed the following findings. First, citizens perceived the well-designed dashboard as easier to use and understand than individual visualizations. Second, citizens with different levels of education perceived that they could more easily understand and draw conclusions from well-designed dashboards than from individual visualizations. Third, citizens perceived data veracity and quality more easily when using well-designed dashboards than individual visualizations. Fourth, citizens would be more interested in engaging with OGD if the data were represented using well-designed dashboards rather than individual visualizations. Based on these results, we can answer RQ2 by saying that the use of well-designed dashboards can help citizens engage more with OGD than individual visualizations. The evaluation also showed that while all design principles matter for citizen engagement with OGD through dashboards, choosing meaningful metrics (P1), using appropriate visualizations (P5, P6), and providing a clear presentation and layout (P8) are the most important.

A APPENDIX

A.1 List of Publications Retained in the Systematic Literature Review (SLR)

Table 6.

Dashboard Design Principles     | Category
[15, 17, 21, 23, 25, 26, 29, 30] | Public sector
[16, 20, 22, 24, 27]             | General

Visualization Design Principles | Category
[32, 34]                         | Public sector
[35–40]                          | General

UX and UI Design Principles     | Category
[41–43]                          | General

Table 6. List of 17 Scientific Publications and 7 Gray Literature Contributions Retained in the SLR

A.2 List of Questions in the Surveys

Table 7.

Questions for [Visualizations/Dashboard] to Address Citizen Engagement
(All statements were rated on a scale from "Totally Disagree" to "Totally Agree", with a free-text field to justify the choice.)
Q1. I think that I would like to use [these visualizations/this dashboard] frequently
Q2. I found [these visualizations/this dashboard] unnecessarily complex
Q3. I thought [these visualizations/this dashboard] were easy to use
Q4. I think that I would need the support of a technical person to be able to use [these visualizations/this dashboard]
Q5. I found the various functions in [these visualizations/this dashboard] were well integrated
Q6. I thought there was too much inconsistency in [these visualizations/this dashboard]
Q7. I would imagine that most people would learn to use [these visualizations/this dashboard] very quickly
Q8. I found [these visualizations/this dashboard] very difficult to use
Q9. I felt very confident using [these visualizations/this dashboard]
Q10. I needed to learn a lot of things before I could get going with [these visualizations/this dashboard]
Q11. I can easily tell what we can learn from the datasets based on [these visualizations/this dashboard]
Q12. I can easily draw conclusions based on [these visualizations/this dashboard]
Q13. I can easily understand [these visualizations/this dashboard]
Q14. I can easily modify or customize [these visualizations/this dashboard] to see other aspects of the datasets
Q15. [These visualizations/this dashboard] provide(s) me necessary information to verify that the data used are accessible, accurate, and up to date and to easily assess the quality of the datasets
Q16. Using this type of [visualization/dashboard] to present data makes me want to engage with (i.e., explore, understand, and use) more data on a portal

Demographic Questions
Q17. How old are you? [18–29 / 30–49 / 50+]
Q18. What is your gender? [Female / Male / Other]
Q19. What is your level of education? [None / Primary / High School / Higher Education / PhD]

Table 7. Survey Questions for the Evaluation

A.3 List of Metrics in NBDash

Table 8.

Metrics | Display Type
M1. Total [ordinary/extraordinary] [revenue/expense] for a specific year, with the possibility to compare to the previous year | All (Simple, Less advanced, Advanced)
M2. Analysis of [ordinary/extraordinary] [revenue/expense] by function for a specific year | All
M3. Analysis of [ordinary/extraordinary] [revenue/expense] by [revenue/expense] type for a specific year | All
M4. Analysis of [ordinary/extraordinary] [revenue/expense] by function over time | Less advanced, Advanced

Table 8. Metrics in NBDash Based on the Different Display Types ([Ordinary/Extraordinary] and [Revenues/Expenses] Are Used as Filters)

Footnotes

1. https://data.gov.sg/
2. https://datastudio.google.com
3. https://www.tableau.com
4. https://powerbi.microsoft.com
5. https://dublindashboard.ie/
6. https://citydashboard.org/london/
7. https://datausa.io/profile/geo/new-york-ny
8. https://citydashboard.org/london/
9. https://www.boston.gov/innovation-and-technology/cityscore
10. https://www.scopus.com/
11. https://www.sciencedirect.com/
12. https://dl.acm.org/
13. https://rb.gy/61r8dk
14. https://rb.gy/dpayws
15. https://data.namur.be/
16. https://www.dragnsurvey.com
17. https://www.mturk.com
18. https://pandas.pydata.org/
19. https://plotly.com/dash/
20. http://openbudget.lacity.org/#!/year/default
21. https://chart-studio.plotly.com/

REFERENCES

[1] Janssen M., Matheus R., Longo J., and Weerakkody V. 2017. Transparency-by-design as a foundation for open government. Transform. Gov. People, Process Policy 11, 1 (2017), 2–8.
[2] Attard J., Orlandi F., Scerri S., and Auer S. 2015. A systematic review of open government data initiatives. Gov. Inf. Q. 32, 4 (2015), 399–418.
[3] Lnenicka M. and Nikiforova A. 2021. Transparency-by-design: What is the role of open data portals? Telemat. Informatics 61 (2021), 1–18.
[4] Hivon J. and Titah R. 2017. Conceptualizing citizen participation in open data use at the city level. Transform. Gov. People, Process Policy 11, 1 (2017), 99–118.
[5] Mellouli S., Luna-Reyes L. F., and Zhang J. 2014. Smart government, citizen participation and open data. Inf. Polity 19, 1–2 (2014), 1–4.
[6] Lněnička M., Machova R., Volejníková J., Linhartová V., Knezackova R., and Hub M. 2021. Enhancing transparency through open government data: The case of data portals and their features and capabilities. Online Inf. Rev.
[7] Erickson J. S., Viswanathan A., Shinavier J., Shi Y., and Hendler J. A. 2013. Open government data: A data analytics approach. IEEE Intell. Syst. 28, 5 (2013), 19–23.
[8] Kitchin R., Lauriault T. P., and McArdle G. 2015. Knowing and governing cities through urban indicators, city benchmarking and real-time dashboards. Reg. Stud. Reg. Sci. 2, 1 (2015), 6–28.
[9] Biswas P. et al. 2020. COVID-19 data visualization through automatic phase detection. Digit. Gov. Res. Pract. 1, 4 (2020), 1–8.
[10] Röddiger T., Beigl M., Dörner D., and Budde M. 2021. Responsible, automated data gathering for timely citizen dashboard provision during a global pandemic (COVID-19). Digit. Gov. Res. Pract. 2, 1 (2021), 1–9.
[11] Purwanto A., Zuiderwijk A., and Janssen M. 2020. Citizen engagement with open government data: Lessons learned from Indonesia's presidential election. Transform. Gov. People, Process Policy 14, 1 (2020), 1–30.
[12] Few S. 2006. Information Dashboard Design: The Effective Visual Communication of Data. Retrieved April 8, 2021, from https://www.researchgate.net/publication/31860304_Information_Dashboard_Design_The_Effective_Visual_Communication_of_Data_S_Few.
[13] Key A., Howe B., Perry D., and Aragon C. 2014. VizDeck: Self-organizing dashboards for visual analytics. In Proc. ACM SIGMOD Int. Conf. Manag. Data, 681–684.
[14] Yalcin M., Elmqvist N., and Bederson B. 2018. Keshif: Rapid and expressive tabular data exploration for novices. IEEE Trans. Vis. Comput. Graph. 24, 8 (2018), 2339–2352.
[15] Kitchin R. and McArdle G. 2016. Urban data and city dashboards: Six key issues. In Data and the City.
[16] Few S. 2006. Common pitfalls in dashboard design. ProClarity Corp. 31 (2006). https://www.perceptualedge.com/articles/Whitepapers/Common_Pitfalls.pdf.
[17] Matheus R., Janssen M., and Maheshwari D. 2020. Data science empowering the public: Data-driven dashboards for transparent and accountable decision-making in smart cities. Gov. Inf. Q. 37, 3 (2020), 101284.
[18] Petersen K., Feldt R., Mujtaba S., and Mattsson M. 2008. Systematic mapping studies in software engineering. In 12th Int. Conf. Eval. Assess. Softw. Eng. (EASE'08).
[19] Webster J. and Watson R. T. 2002. Analyzing the past to prepare for the future: Writing a literature review. MIS Q. 26, 2 (2002), xiii–xxiii.
[20] Smith V. S. 2013. Data dashboard as evaluation and research communication tool. In Data Visualization, Part 2. New Directions for Evaluation, T. Azzam and S. Evergreen (Eds.), no. 140, 21–45.
[21] Maheshwari D. and Janssen M. 2014. Dashboards for supporting organizational development: Principles for the design and development of public sector performance dashboards. In Proc. 8th Int. Conf. Theory Pract. Electron. Gov., 178–185.
[22] Brath R. and Peters M. 2004. Dashboard design: Why design is important. DM Review Online.
[23] Ganapati S. 2011. Use of dashboards in government. In Fostering Transparency and Democracy Series.
[24] Sarikaya A., Correll M., Bartram L., Tory M., and Fisher D. 2019. What do we talk about when we talk about dashboards? IEEE Trans. Vis. Comput. Graph. 25, 1 (2019), 682–692.
[25] Durcevic S. 2021. Top 20 dashboard design principles. In Best Practices & How To's. Retrieved August 19, 2021, from https://www.datapine.com/blog/dashboard-design-principles-and-best-practices/.
[26] Young G. W. and Kitchin R. 2020. Creating design guidelines for building city dashboards from a user's perspectives. Int. J. Hum. Comput. Stud. 140 (2020), 102429.
[27] Janes A., Sillitti A., and Succi G. 2013. Effective dashboard design. Cut. IT J. 26, 1 (2013), 17–24.
[28] Crusoe J., Simonofski A., Clarinval A., and Gebka E. 2019. The impact of impediments on open government data use: Insights from users. In 13th Int. Conf. Res. Challenges Inf. Sci., 1–12.
[29] Young G. W., Kitchin R., and Naji J. 2021. Building city dashboards for different types of users. J. Urban Technol. 28, 1–2 (2021), 289–309.
[30] Tableau. 2021. 10 Best Practices for Building Effective Dashboards. Retrieved August 19, 2021, from https://www.tableau.com/learn/whitepapers/10-best-practices-building-effective-dashboards.
[31] Lassinantti J., Ståhlbröst A., and Runardotter M. 2019. Relevant social groups for open data use and engagement. Gov. Inf. Q. 36, 1 (2019), 98–111.
[32] Graves A. and Hendler J. 2014. A study on the use of visualizations for open government data. Inf. Polity 19, 1–2 (2014), 73–91.
[33] Nikiforova A. and Lnenicka M. 2021. A multi-perspective knowledge-driven approach for analysis of the demand side of the open government data portal. Gov. Inf. Q. 38, 4 (2021), 101622.
[34] Chokki A. P., Simonofski A., Frénay B., and Vanderose B. 2021. Open government data for non-expert citizens: Understanding content and visualizations' expectations. Res. Challenges Inf. Sci. 415 LNBIP (2021), 602–608.
[35] Shneiderman B. 1996. The eyes have it: A task by data type taxonomy for information visualizations. In Proc. IEEE Symp. Vis. Lang., 336–343.
[36] Heer J. and Shneiderman B. 2012. A taxonomy of tools that support the fluent and flexible use of visualizations. Interact. Dyn. Vis. Anal. 10 (2012), 1–26. http://queue.acm.org/detail.cfm?id=2146416.
[37] Holtz Y. and Conor H. 2018. From Data to Viz. Retrieved May 17, 2021, from https://www.data-to-viz.com/.
[38] Munzner T. 2014. Visualization Analysis and Design.
[39] Ribecca S. 2021. The Data Visualisation Catalogue. Retrieved April 21, 2021, from http://www.datavizcatalogue.com/.
[40] Wilke C. O. 2019. Fundamentals of Data Visualization: A Primer on Making Informative and Compelling Figures.
[41] Usability.gov. 2021. User Interface Design Basics. Retrieved August 19, 2021, from https://www.usability.gov/what-and-why/user-interface-design.html.
[42] UXPin. 2021. The Basic Principles of User Interface Design. Retrieved August 19, 2021, from https://www.uxpin.com/studio/blog/ui-design-principles/.
[43] Davies N. 2021. The 7 principles of UX design - and how to use them. 99designs. Retrieved August 19, 2021, from https://en.99designs.be/blog/web-digital/ux-design-principles/.
[44] Araújo A. C., Reis L., and Sampaio R. C. 2016. Do transparency and open data walk together? An analysis of initiatives in five Brazilian capitals. Media Stud. 7, 14 (2016), 65–83.
[45] Corrêa A. S., Corrêa P. L. P., and Da Silva F. S. C. 2014. Transparency portals versus open government data: An assessment of openness in Brazilian municipalities. In Proc. 15th Annu. Int. Conf. Digit. Gov. Res., 178–185.
[46] Campbell D. T. and Stanley J. C. 1969. Experimental and Quasi-Experimental Designs for Research. Rand McNally, Chicago.
[47] Dekkers M., Loutas N., De Keyzer M., and Goedertier S. 2014. Open data & metadata quality. Open Data Support. Retrieved August 23, 2021, from https://joinup.ec.europa.eu/sites/default/files/document/2015-05/d2.1.2_training_module_2.2_open_data_quality_v1.00_en.pdf.
[48] Nikiforova A. 2020. Timeliness of open data in open government data portals through pandemic-related data: A long data way from the publisher to the user. In 4th Int. Conf. Multimed. Comput. Netw. Appl., 131–138.
[49] Grimm P. 2010. Pretesting a questionnaire. In Wiley International Encyclopedia of Marketing. John Wiley & Sons.
[50] Crowston K. 2012. Amazon Mechanical Turk: A research tool for organizations and information systems scholars. IFIP Adv. Inf. Commun. Technol. 389 (2012), 210–221.
[51] Berinsky A. J., Huber G. A., and Lenz G. S. 2012. Evaluating online labor markets for experimental research: Amazon.com's Mechanical Turk. Polit. Anal. 20, 3 (2012), 351–368.
[52] Brooke J. 1986. SUS: A quick and dirty usability scale. Usability Eval. Ind. (1986).
[53] Brooke J. 2013. SUS: A retrospective. J. Usability Stud. 8, 2 (2013), 29–40.
[54] Vetrò A., Canova L., Torchiano M., Minotas C. O., Iemma R., and Morando F. 2016. Open data quality measurement framework: Definition and application to open government data. Gov. Inf. Q. 33, 2 (2016), 325–337.
[55] Faulkner L. 2003. Beyond the five-user assumption: Benefits of increased sample sizes in usability testing. Behav. Res. Methods Instrum. Comput. 35 (2003), 379–383.
[56] Nielsen J. 2000. Why you only need to test with 5 users. Nielsen Norman Group. Retrieved June 17, 2021, from https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/.
[57] Six J. M. and Macefield R. 2016. How to determine the right number of participants for usability studies. UXmatters. Retrieved June 17, 2021, from https://www.uxmatters.com/mt/archives/2016/01/how-to-determine-the-right-number-of-participants-for-usability-studies.php.
[58] Alroobaea R. and Mayhew P. J. 2014. How many participants are really enough for usability studies? In Science and Information Conference 2014, 48–56.
  59. [59] M. Dupuis, B. Endicott-Popovsky, and R. Crossler. 2013. An analysis of the use of Amazon's Mechanical Turk for survey research in the cloud. In Proceedings of the International Conference on Cloud Security Management: ICCSM 2013. 10.Google ScholarGoogle Scholar
  60. [60] Ehrich K.. 2021. Mechanical Turk: Potential concerns and their solutions. Summit, 2020. Retrieved June 18, 2021, from https://www.summitllc.us/blog/mechanical-turk-concerns-and-solutions.Google ScholarGoogle Scholar
  61. [61] Cobanoglu C., Cavusoglu M., and Turktarhan G.. 2021. A beginner's guide and best practices for using crowdsourcing platforms for survey research: The case of amazon mechanical turk (MTurk). J. Glob. Bus. Insights 6, 1 (2021), 9297. DOI:Google ScholarGoogle ScholarCross RefCross Ref
  62. [62] Venkatesh V., Morris M. G., Davis G. B., and Davis F. D.. 2003. User acceptance of information technology: Toward a unified view. MIS Q. Manag. Inf. Syst 27, 3 (2003), 425478. DOI:Google ScholarGoogle Scholar
  63. [63] Momani A. M.. 2020. The unified theory of acceptance and use of technology: A new approach in technology acceptance. Int. J. Sociotechnology Knowl. Dev 12, 3 (2020), 7998. DOI:Google ScholarGoogle ScholarCross RefCross Ref

Published in

Digital Government: Research and Practice, Volume 3, Issue 3 (July 2022), 94 pages.
EISSN: 2639-0175
DOI: 10.1145/3561951

Publisher: Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 23 December 2021
• Revised: 7 July 2022
• Accepted: 9 August 2022
• Online AM: 22 August 2022
• Published: 14 October 2022

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].