Rise 360: How to add a free response survey question

Feb 10, 2020

Hello eLearning Community,

Does anyone know how to add a survey question into Rise? I created an eLearning course in Rise and wanted learners to be able to give feedback directly within the course. 

Because there is no survey option, I added an interactive knowledge check block and used a fill-in-the-blank question. The only problem is that when a learner submits their answer, they receive a message saying their answer is "Incorrect."

Although I wrote feedback text thanking learners for any response (rather than separate correct/incorrect messages), the feedback always says their submission is incorrect. Is there a way to ask a question and disable the correct/incorrect grading, so that learners do not see "Correct" or "Incorrect" after they submit their responses?

Thank you!

12 Replies
Alyssa Gomez

Hello, Cecilia!

Thanks for letting us know you need a survey block in Rise 360. That's a popular request, and we'll send you an update if we add that feature in the future!

For now, I would suggest designing a survey in a third-party survey tool, then embedding it in your Rise 360 lesson using a Multimedia Embed block. Use this iframe format for easy embedding: `<iframe src="URL HERE"></iframe>`
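As a fuller sketch of that iframe format (the `src` URL below is a placeholder for your own survey's share link, and the sizing attributes are suggestions, not anything Rise requires), the markup you paste into the Multimedia Embed block might look like:

```html
<!-- Placeholder URL: replace with your published survey's share link -->
<iframe src="https://example.com/your-survey"
        width="100%" height="600"
        style="border: none;"
        title="Course feedback survey"></iframe>
```

Setting `width="100%"` lets the survey fill the lesson column, and a fixed `height` avoids an inner scrollbar for short surveys.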

Give it a try, and let me know if you need help along the way. 😁

Sandeep Gadam

Hi Cecilia,

I suggest creating a survey question in Storyline and publishing that particular course to Review 360. Then, in Rise, create a new block: choose Interactive, select Storyline, click Browse Review 360, and choose the specific survey course you want to add to your Rise course.

Try this and let me know if this works for you.

Doris Li

I have exactly the same question as Cecilia. Could you be more specific? If possible, step-by-step instructions would be very much appreciated. I tried to embed the code format and it didn't seem to work. (I've probably done it the wrong way, so I'd love to know how to do it right.)

Hannah Soini

Hi @alyssa gomez! A question for you regarding this: when using this option and adding an interactive block to Rise, is there a way to actually collect the answers at the end of the course?
I just got this question and I'm not sure if it's technically possible.

At the moment I've just created a freeform question in Storyline and added it to Rise, but I can change it to a survey question, although I don't know if that makes any difference for saving the answers at the end.

Thanks for your input! :)

John Cooper

For us, the issue of creating a survey in RISE has come up a number of times with clients. The approach we have taken is to use a Storyline block - but we quite like using sliders so our demo here does exactly that.

https://demo9.profilelearning.com

Obviously this is just the Storyline bit, but we have used this approach in RISE. The JavaScript library used is loaded dynamically (at run time), so there's no messing around trying to amend HTML after publishing. It uses a PDF form template to create the downloadable output, which we include in an attachment block in the RISE course. The nice thing here is that it's easy to change the output layout without affecting the JavaScript.

Hannah -

With regard to holding the survey results until the end of the course (and, I guess, potentially combining the results of multiple survey blocks), we haven't tried that yet. However, we recently posted a demo of creating a Learning Journal in RISE where we used local browser storage to hold the variables collected from Storyline blocks during the course, so I see no problem using the same approach here.
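As a rough sketch of that local-storage approach (the key name and JSON shape here are our own choices, not anything Rise or Storyline prescribes), a Storyline "Execute JavaScript" trigger could save each answer, and a final block could read them all back:

```javascript
// Sketch: persist survey answers across Storyline blocks in a Rise
// course using Web Storage. SURVEY_KEY is a hypothetical key name.
const SURVEY_KEY = "courseSurveyAnswers";

// Save one answer. In the browser, pass window.localStorage as `storage`;
// taking it as a parameter also makes the function easy to test.
function saveAnswer(storage, questionId, answer) {
  const answers = JSON.parse(storage.getItem(SURVEY_KEY) || "{}");
  answers[questionId] = answer;
  storage.setItem(SURVEY_KEY, JSON.stringify(answers));
}

// Read everything back at the end of the course.
function loadAnswers(storage) {
  return JSON.parse(storage.getItem(SURVEY_KEY) || "{}");
}
```

From a Storyline trigger this would be called as `saveAnswer(window.localStorage, "q1", answerVariable)`; note that localStorage survives page reloads on the same browser, but not a move to a different device.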

It seems like a worthwhile project to include that in another demo.

AND before anyone asks - no we don't pass the results to an LMS. It's just so the learner can take the survey and download the results themselves (although we have emailed it to the course tutor in one instance).

We still need this facility built into RISE!

John Cooper

We recently did a project (with learner notes rather than a survey - but the principle is the same) where we captured learner input and stored it in the SCORM 2004 data structure rather than relying on local browser memory.

It would be possible to use the same approach to store survey results in SCORM memory on a compliant LMS - but that doesn't mean the LMS would report those results. A compliant LMS has to support GetValue and SetValue calls for certain fields in the SCORM data model, but the standard doesn't require those fields to be included in LMS reporting.
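As a minimal sketch of those calls (assuming you already have a handle to the SCORM 2004 run-time API object the LMS exposes, usually named `API_1484_11`; the discovery/window-walking code is omitted, and the choice of `cmi.suspend_data` as the storage field is ours):

```javascript
// Sketch: stash survey answers in SCORM 2004 suspend data.
// Caveat: the course player may already use cmi.suspend_data for resume
// data, so in practice you'd need to merge with its contents, not overwrite.
function saveSurveyToScorm(api, answers) {
  // cmi.suspend_data is a general-purpose string field that a
  // compliant LMS must persist between sessions.
  api.SetValue("cmi.suspend_data", JSON.stringify(answers));
  api.Commit(""); // ask the LMS to persist the data now
}

function loadSurveyFromScorm(api) {
  const raw = api.GetValue("cmi.suspend_data");
  return raw ? JSON.parse(raw) : {};
}
```

Whether anything stored this way shows up in reports is entirely down to the LMS, which is John's point above.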

Using xAPI where the LMS/LRS is likely to have a report generator might be a better way to go.
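For anyone exploring the xAPI route, here is a sketch of what a single survey answer could look like as a statement (the activity ID and email are placeholders; the "responded" verb and "question" activity type are from the standard ADL vocabulary, but how you send the statement to your LRS depends on its credentials and endpoint):

```javascript
// Sketch: build a minimal xAPI statement for one survey response.
function buildSurveyStatement(learnerEmail, questionId, answer) {
  return {
    actor: { mbox: "mailto:" + learnerEmail },
    verb: {
      id: "http://adlnet.gov/expapi/verbs/responded",
      display: { "en-US": "responded" },
    },
    object: {
      // Hypothetical activity ID scheme - use your own IRIs in practice.
      id: "http://example.com/survey/" + questionId,
      definition: { type: "http://adlnet.gov/expapi/activities/question" },
    },
    result: { response: String(answer) },
  };
}
```

The advantage over SCORM here is that each answer becomes a first-class record in the LRS, so the report generator can query and aggregate them directly.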