Tuesday, April 30, 2019

MoPOP Evaluation

http://www.informalscience.org/sites/default/files/MoPOP_Full%20Evaluation%20Report_Final.pdf
I was interested in this study because I got to visit the MoPOP two years ago, during a very small after-hours event, and it was a great experience.

The goal of this study was to find out what visitors think of pop culture and the MoPOP, and to see how those perceptions changed after a visit to the museum. Their secondary goal was to identify who the main MoPOP visitors are and to gauge their satisfaction with their experience.

Their method of evaluation was to give visitors facilitated questionnaires: a pre-visit questionnaire on entry and a post-visit questionnaire on exit. The pre-visit questionnaire was given to every third person who entered the museum, and the post-visit questionnaire to any guest who exited. Because the questionnaires were not given to the same audiences, the pre- and post-visit data do not directly correlate.
The questionnaires had both open- and closed-ended questions. For both, a facilitator would ask a number of open-ended questions and give the guests a map: the pre-visit questionnaire had the guest create a personal meaning map, and the post-visit questionnaire had the guest place stickers on a poster-board map of the museum to mark which areas they visited.
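The every-third-person intake rule described above is a simple systematic sample. A minimal sketch of that rule in Python, with illustrative names that are mine rather than the report's:

```python
def systematic_sample(visitors, interval=3):
    """Select every `interval`-th person from an ordered stream of
    arrivals -- the intake rule MoPOP used for its pre-visit
    questionnaire (every third person entering the museum)."""
    return [v for i, v in enumerate(visitors, start=1) if i % interval == 0]

# Nine entrants: visitors 3, 6, and 9 would be approached.
entrants = [f"visitor_{n}" for n in range(1, 10)]
print(systematic_sample(entrants))
```

Compared with approaching whoever looks willing, a fixed interval like this removes some of the facilitator's selection bias, which is presumably why the evaluators chose it.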

They found that most of their guests are white, female, between the ages of 18 and 35, visiting in adult groups, from out of state, and first-time MoPOP visitors. The most common way visitors heard of MoPOP was word of mouth. The majority of guests were experience seekers, citing "something to do" as their main reason for visiting the museum.

Monday, April 29, 2019

Review of Summative Evaluation: Climate Change and Resiliency

Objective
The researchers from RK&A were tasked with determining whether the programs funded by NOAA at the Science Museum of Virginia had succeeded in achieving the goals the museum set for each program. The programs included a video, an event, a program, and a lecture series.
Methods
RK&A observed each of the programs and interviewed 52 participants and 5 staff members.
Conclusions
RK&A found that the programs did have the intended effect on the participants. The video was very good at outlining complex topics, and the event did a stellar job of raising awareness. The lecture series was successful in delivering data-forward presentations, but some of the lectures were more academic than general guests could follow.
RK&A noted that the series could work to lessen the feeling of hopelessness often associated with climate change discussion.

Summative Evaluation of Design Build Fly / Sophia Rowen

Rowen, Sophia 
4/29/19 
Summative Evaluation of Design Build Fly 
Prepared for Exploration Place in Wichita, Kansas 
By RK&A [impact planning, evaluation, audience research]

http://www.informalscience.org/summative-evaluation-design-build-fly-exhibition-and-program-series
http://www.informalscience.org/sites/default/files/2018_RKA_ExplorationPlace_DesignBuildFly_Summative_final.pdf

Goals 
            The goal of the summative evaluation was to determine the successes and challenges of the Design Build Fly exhibition and programs as compared to Exploration Place’s intentions. The objectives of the summative evaluation were to:
1.    Understand visitors' motivations for visiting the exhibition and participating in programs
2.    Identify how visitors are using the exhibition space overall and what programs they participate in 
3.    Explore what meaning visitors make from their experience and to what extent visitors' meaning-making aligns with the exhibition's and programs' intended outcomes 
Methods 
            Two methods were employed between March and May 2018: timing and tracking observations and in-depth interviews. Timing and tracking observations recorded which components visitors used, for how long, and how visitors behaved. Visitors were randomly selected as they crossed an imaginary line at the exhibition entrance, and the observations were unobtrusive. The in-depth interviews were open-ended and encouraged interviewees to express their opinions, understandings, and meanings. Participants were asked about the most enjoyable aspect of the exhibition, the least enjoyable aspect, any confusing aspects (note: they have signage), what ideas and processes they took away from the exhibition, and whether they had a desire to learn about careers in STEM. 
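Timing and tracking data like this is usually reduced to per-visitor summaries such as total stops, total time, and time by section. A minimal sketch of that reduction, assuming invented section and component names:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Stop:
    section: str    # exhibition section the component belongs to
    component: str  # the specific component the visitor stopped at
    seconds: float  # dwell time at that component

def summarize(stops):
    """Reduce one visitor's tracked stops to the kinds of measures a
    timing-and-tracking report tabulates: total stops, total time,
    and time spent by section."""
    by_section = defaultdict(float)
    for s in stops:
        by_section[s.section] += s.seconds
    return {
        "total_stops": len(stops),
        "total_seconds": sum(s.seconds for s in stops),
        "seconds_by_section": dict(by_section),
    }

track = [Stop("Design", "sketch table", 95.0),
         Stop("Fly", "wind tunnel", 240.0)]
print(summarize(track))
```

Aggregating these per-visitor summaries across the sample is what yields findings like "stops and time spent by section" in the report.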

Findings 
            The findings included visitor background characteristics, gallery contexts (which components of the exhibit were down or unavailable during observation), time spent, where visitors entered the exhibition, total number of stops throughout the exhibition, stops and time spent by section, behaviors, and social behaviors (conversing aloud, pointing). 
            They found that the exhibition and programs largely measured up to the intended outcomes for visitors. However, the interviews revealed that one intended outcome (visitors from Wichita feeling pride in how the aviation industry has shaped their family members' lives) was not met. Because many visitors were not Wichita residents, this outcome was difficult to achieve. They concluded that bolstering visits to the Wichita stories component on the ground floor could help build those feelings of pride. 

Something interesting 

They conducted another set of interviews strictly with adolescents who had attended the Aviation is Awesome family event (this event allowed them to interact with the same exhibition that was being evaluated). They asked what their motivations for attending the event were and categorized them into five categories: something fun, interest in aviation and industry, school invitation, someone else wanted to come, and an interest in the Design Build Fly exhibition. 

They defined different outcomes for different audiences. For example, they categorized their audiences into 1. children on school field trips (3rd–8th grade), 2. families (casual walk-in visitors), 3. out-of-school-time adolescents, and 4. aviation professionals (current and retired). This reminded me of our age-specific learning outcomes in the curriculum we have in the Innovation Workshop. 


This evaluation can be compared to staff at MOXI interviewing guests as they exit and observing guests along a specific track on a weekend. One cool thing RK&A did was develop an observation/reflection guide and train Exploration Place staff to use it to conduct their own formative evaluation of the programs. 

Field Museum - Visitor Experiences at Program Carts

Summarize its goals, methods, and findings.  What interested you in this study? 

http://www.informalscience.org/sites/default/files/RKA%20FieldMuseum%20ScienceHubDiscoverySquad%20Summative%20Report.pdf

I was drawn to this study because it seemed relevant to MOXI.  The museum evaluated two of its program cart-type experiences and what value they provided to visitors.  The first experience is a "hub" - a quieter area off the main hall of the museum that is always facilitated by "welcoming and enthusiastic" volunteers.  The second experience is a mobile discovery cart that is also facilitated by volunteers.

"The goals of the study are to explore the extent to which visitors interact with programming in the Science Hub and at Discovery Squad Carts and the nature of those interactions, as well as visitor motivations and takeaways."

For the study, they conducted exit interviews and timed/tracked visitors.  They describe the methodology as "unobtrusive."  They were very precise about the time spent at the exhibits and the specific behaviors exhibited, like touching an exhibit, reading a label, or taking a picture.  They used the data to look for correlations between specific behaviors and demographics.
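Looking for correlations between specific behaviors and demographics typically starts with a cross-tabulation of the coded observations. A rough sketch of that first step, with invented group and behavior labels:

```python
from collections import Counter

def crosstab(records):
    """Count how often each (group, behavior) pair occurs in coded
    tracking data -- the raw table behind behavior/demographic
    comparisons like the Field Museum's."""
    return Counter(records)

# Each record pairs a demographic group with one observed behavior.
observations = [
    ("family", "touched exhibit"),
    ("family", "read label"),
    ("family", "touched exhibit"),
    ("adult pair", "took photo"),
    ("solo adult", "read label"),
]
table = crosstab(observations)
print(table[("family", "touched exhibit")])  # 2
```

From counts like these an evaluator can test whether, say, families touch exhibits at a different rate than solo adults.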

Despite looking for specific answers/causes/correlations in the data, the findings were more general in confirming the purpose of the activities.  "We conclude that Discovery Squad Carts are welcoming and ... successful in priming visitors to be curious and ask questions."  Based on the exit interviews they found that the carts provide "potentially tremendously valuable orienting experiences."  This is encouraging because I think that MOXI shares the goal of having its carts be a welcoming and priming activity.

The findings led the Field Museum to make suggestions for improvement.  A major finding was that visitors enjoyed the carts because they were "able to ask questions of someone more knowledgeable than themselves."  I think that this is an expectation at MOXI's carts (at least from adults).  They expect to be taught some science!  At the Field Museum, a finding was that the facilitators sometimes seemed nervous or unconfident, so the museum recommended training them to be better at public speaking and providing answers.  I think we could adapt this for MOXI so that Sparks are more confident in open-ended facilitation.  I think that it's important to be confident in our "non-answers."

Summative Evaluation of Portal to the Public Programs at Five Science Museums - Sam S.

My capstone project is related to an existing program called Portal to the Public developed by the Pacific Science Center in Seattle, which partners informal education teams with professional scientists to create programs for science outreach. I searched the evaluations and found their summative evaluation for their pilot program, which concluded in 2009. Pacific Science Center contracted with an evaluation consulting firm to conduct the evaluation. They were trying to answer three main questions:

1) To what extent and in what ways were the PoP guiding framework, materials, and approaches implemented and adopted at the five sites?
2) What factors affected implementation and adoption?
3) To what extent and in what ways was the PoP approach effective in A. Building partnerships with scientists and science-based organizations? B. Providing professional development to scientists? C. Communicating current science to museum visitors?

The evaluation consisted of surveys of three populations: informal science educators, participating scientists, and groups of general public museum visitors. They also collected data through passive observation and in-depth interviews. They were able to survey 13 informal educators, 38 scientists, and 16 visiting groups which consisted of 69 museum visitors.

On the preparation end of the survey for the informal educators and scientists, they found that formal experience in professional development and development of public programs helped with implementing the program, as well as robust existing relationships between institutions and scientists. They also found that participating scientists who attended professional development workshops associated with the program were much more understanding of the overall mission and more prepared to conduct science outreach.

All five institutions reported that finding partner scientists took longer than they anticipated (I've already found this myself). For scientists that did participate, the primary motivation they cited was to "communicate work and raise public awareness of science" with the second most being to "encourage young people to enter science, technology, engineering, and math (STEM) fields."

On the museum visitor side, the evaluators found a positive response. They noted that the design of each scientist's teaching tools, as well as their vocabulary, greatly affected how accessible their science was to adults or to children. Many visitors expressed enthusiasm and reported learning about science careers of which they were previously unaware. Most notably, the programs with dynamic interactions between scientists and visitors, and those based around hands-on materials, seemed to induce the most learning.

Saturday, April 27, 2019

Response to Angela - Destiny

Hey Angela,

I just had a question for you about the research being done at the Brookfield Zoo. Do you think the 25+ minute interview was effective? I am thinking of myself on a day out and being stopped for a satisfaction survey or interview. I would not want it to go on any more than five minutes before I would feel smothered or want to leave. Especially if they are interviewing parents, given that this zoo, like most, is family friendly. Do you think they would have gotten different results with a different method?

other institution evaluation- destiny

Summarize its goals, methods, and findings.  What interested you in this study? 

Pixar teamed up with the Museum of Science, Boston and the Science Museum Exhibit Collaboration to create The Science Behind Pixar exhibition. They ran evaluations on the exhibition to learn about their guests, using observations; pre-, post-, and follow-up surveys; and post-visit interviews as their data collection methods. They found that their audience matched the anticipated target audience, measured the impact the exhibition had on that audience, and found that the exhibition supported a wide range of guest interactions. 
I was interested in this study because I love Pixar and Disney and would love to go to see any sort of exhibition they have about their work. 

Friday, April 26, 2019

Monterey Bay Aquarium Conservation Messages Front-End Evaluation - Juliana


          The goal of this front-end evaluation, done at the Monterey Bay Aquarium, was to better understand visitors' awareness of and engagement in conservation-related issues, and how they respond to conservation messages. They hoped to answer the following questions:
·      Are visitors aware of issues affecting the ocean and how they can help?
·      What motivates visitors to take actions to help the ocean? What actions are considered easier or more difficult than others? What actions are more or less impactful?
·      How do messages in the aquarium affect visitors’ attitudes towards conservation?
Two different methods were used in order to answer these questions. The first was a Conservation Action Card Sort, which helped to determine which conservation actions visitors participated in and why, whether they believe these actions are easy or difficult, as well as the impact visitors believe these actions have. The second method was a Concept Map and Message Framing, which gave more information on visitors’ awareness of ocean conservation actions, as well as visitors’ responses to conservation messaging.
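A card sort like this typically yields, per conservation action, counts of how participants rated its difficulty and impact. A minimal tally sketch; the action names and rating labels below are illustrative, not from the aquarium's report:

```python
def tally(card_sorts):
    """Count votes for each (action, difficulty, impact) combination
    produced by a conservation-action card sort."""
    counts = {}
    for action, difficulty, impact in card_sorts:
        key = (action, difficulty, impact)
        counts[key] = counts.get(key, 0) + 1
    return counts

ratings = [
    ("reduce plastic use", "easy", "high impact"),
    ("reduce plastic use", "easy", "high impact"),
    ("eat sustainable seafood", "difficult", "high impact"),
]
print(tally(ratings)[("reduce plastic use", "easy", "high impact")])  # 2
```

Tallies like these are what let the evaluators say which actions visitors see as easy versus difficult and as more or less impactful.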
            For the first question, they found that participants had a high awareness of plastic pollution as one of the largest threats to the ocean, with many participants also mentioning climate change as a major threat. More participants were aware of actions they could take related to plastic pollution, while participants had a more difficult time coming up with and engaging in actions related to climate change. For the third question, most participants responded positively to conservation messaging about plastic use and sustainable seafood. Some participants mentioned that they feel sad about issues facing the ocean and that they feel negatively about animals in captivity. Lastly, participants rated "If we don't do something soon, the damage being done to the ocean may be irreparable" as the most impactful conservation message, as it added a sense of urgency to the situation. Many participants responded that they expected to learn how to help the ocean during their visit to the aquarium.
I was interested in this evaluation because I am interested in marine biology and I hope to get more involved in ocean conservation issues as an informal educator. I was also curious to learn about what visitors’ attitudes are towards aquariums because I know that many people feel negatively towards zoos and aquariums for keeping animals in captivity, even though a lot of times the mission of those institutions is conservation and protecting these animals. I was happy to learn that many of the aquarium’s visitors are open and even expect to learn more about ocean conservation issues and what actions they can take. A further addition to this evaluation that I think would be interesting would be adding a question similar to what we did this past week at MOXI, to determine whether the main motivation of guests visiting the aquarium is learning about conservation, or something else.

Wednesday, April 24, 2019

Long Term Evaluation of Interactive Exhibits at Brookfield Zoo

The Bird Discovery Point (BDP) at the Brookfield Zoo was comprised of six interactive exhibits with the goals of promoting family interaction, increasing visitor appreciation for birds, increasing use of under-utilized space, and increasing knowledge of birds by creating an entertaining introduction to ornithology. This paper described issues in each evaluation effort, lessons that were learned, and how the BDP evaluations impacted decision-making processes at the zoo. The methods used in this evaluation study were varied: traffic patterns, open-ended interviews, surveys, and both formative and summative evaluations. Open-ended interviews ran 25 minutes, not counting interruptions from other visitors; having both an interviewer and a docent present was essential to complete the interviews without interruption. A series of prototypes was tested in order to design an exhibit that is functional and conveys the goal message. Throughout prototyping the evaluators learned that it is essential to have a large sample size (over 400 individuals) when collecting observational data, that visitors in the space are confused by signage with poetic language or imagery, and that it is essential to gather more than one form of data at an interactive exhibit. Overall, the creation of the BDP exhibits and the evaluation over the four-year period had a positive effect on the development of the zoo and future exhibits.

I was interested in this evaluation because creating informal, interactive exhibits in an environment centered around ecology and animal conservation is something that I would like to do in the future, and this paper gave insight into evaluating not only the exhibit itself but also the impacts of the surrounding people (visitors, staff, grounds and maintenance, etc.) and live animal exhibits.

Observations, Interviews, and Surveys/Questionnaires at MOXI

There are both benefits and drawbacks to all three methods of collecting data presented in the readings. Which form to use depends on what information the evaluator wants to get. I feel that observation tools would fit best in MOXI's environment, since observation is the least intrusive and people would not get pulled away from their experience; however, a lot of valuable information can be lost through observation alone. As a Spark, I am constantly making observations, primarily through ad lib sampling, recording whatever is visible and interesting at the moment. Interviews and questionnaires, on the other hand, allow for better statistical analysis and deeper analysis of guests' thoughts. As a Spark, I also engage in informal conversational interviews, encouraging guests to think about, probe, discuss, and test their ideas, but this can be a biased form of interviewing, and gathering data this way is challenging. I feel that semi-structured interviews would be a beneficial data collection tool at MOXI, since there is some formality in what is going to be asked but they create a more comfortable environment than formal interviews. A combination of these methods could provide the most valuable information, depending on what the evaluator wants to know. For example, to compare the popularity of an exhibit between first-time guests and members, the evaluator could do a simple observation by counting people, followed by a short questionnaire. The most interesting part of this reading was the section on Personal Meaning Mapping (PMM). I am not sure how this could be implemented on the floor, but I think it could be an interesting tool to use with field trips or even at program carts.

Tuesday, April 23, 2019

response to destiny and Kevin / Sophia Rowen

I chose to focus on the big cons of each data collection method, although I realize there are many pros to each method as well. In fact, after reading everyone who posted before me, I agreed with all the pros that each of you wrote so I decided not to reiterate those pros again and instead focus on the possibility of combining each of these methods.

I agree with Kevin that along with mixed methods we need mixed scopes. He says it helps make a more complete picture, and I believe we can utilize all these methods of data collection at MOXI. We already observe guests as Sparks on the floor, and guest services already conducts its survey questionnaires. We could easily conduct short interviews with guests as they leave MOXI, and even some field-study interviews of people passing by on the street. I believe getting data from those who have not set foot in MOXI is valuable as well; it would be beneficial to know what non-guests think about MOXI so we can appeal to that population.

Monday, April 22, 2019

Three data methods RESPONSE TO SOPHIA

It seems like there are only cons listed. I agree with all that is listed above and wonder about the mixed-method approach you discuss at the end. What would you mix? Would you have a root in one of the three methods and then add a few aspects from the other two? Or simply have an even amount of aspects from all three data methods?

Comparing Data Collection Methods -kevin

You have now read about three methods of collecting data: observations, interviews, and surveys/questionnaires. 

Compare and contrast the challenges and utility of these three methods in MOXI. 

All of these methods share a similar challenge in that MOXI (and learning) is so busy and multifaceted.  It can be impossible to pin down why something is happening from just one of these methods alone.  I'm thinking about the observation that I did for the vignette.  I was watching a family of just three people, but there was so much going on.  They all shared a rich interaction in which they had different lines of inquiry and different goals, so it can be hard to parse exactly what is happening and what's causing it.  So we make educated guesses and inferences about what we think is happening.  The same could be said for a survey or interview.  And using one method alone doesn't provide feedback on the inferences we make.  That's why the mixed-method approach is so attractive to me - it helps to paint a more complete picture.  It would've been helpful to interview the people I observed to determine what was influencing their actions.

Also when you use any of these methods alone and in a wide sweep, it can be overwhelming because you get so much data back that the dots are unconnectable.  It makes me think that the most useful data collection in MOXI would be narrowly focused.  Or in addition to using mixed methods, we need to use mixed scopes.  For example, after starting with the very broad survey of exhibits by MOXI staff, we can select something that we want to observe more closely.  Which is exactly what we're doing I guess.

C/C data collection methods / Sophia Rowen


Observations 
·     Observer could inaccurately interpret visitor behavior 

Interviews 
·     Only collecting data from those who physically come to MOXI 
·     Timing of the interview is crucial (a guest doing the guest services survey in the IWS yesterday was asked by her partner why she was doing the survey if they had only gone to one room (IWS) of the museum so far) 
·     A delay in interview could result in forgotten information about an exhibit

Surveys/ questionnaires (surveys provide an efficient method of collecting information and are a direct way to explore the thoughts and feelings of guests.) 
·     Incomplete surveys are problematic, skipping questions etc. 
·     Staff could interpret visitor responses inaccurately
·     Need to be careful of leading questions
·     Intentional and unintentional biases could result from self-report measures 


Given the many pros and cons to each data collection method, I would recommend a mixed-method approach so we can get a comprehensive view of our guests experiences, behaviors, and attitudes towards our exhibits and space at MOXI. 

Evaluation plan (formative) - Sam S.

My capstone would benefit from several evaluations, both in the formative stage, as well as summative evaluation to inform long-term projec...