Comparative Evaluation of Lynda.com and Atomic Learning

Key Takeaways

  • On completion of a pilot evaluating Lynda.com (reported in another article in this publication), the Lethbridge College reviewer compared the college's existing e-learning platform, Atomic Learning, to Lynda.
  • The combined methodologies used to carry out the pilot project and the comparative evaluation complement the goals outlined in the Rapid Assessment of Information Technology (RAIT) approach.
  • A pilot project followed by a comparative evaluation may yield insights that amount to more than the sum of their individual parts.

Drawing on key practices from a technology assessment model referred to as Rapid Assessment of Information Technology (RAIT), developed by Caroline Sinkinson, Mark Werner, and Diane Sieber,¹ a semester-long pilot of Lynda.com gathered feedback from students, instructors, and nonacademic staff at Lethbridge College. (See "Evaluation of Lynda.com at Lethbridge College" in this publication for an explanation of the pilot.) Feedback indicated both demand and support for Lynda on campus. Less clear, however, was why the participants perceived Lynda as superior to our existing solution, Atomic Learning. A sizeable difference in the license fees provided the impetus for conducting a comparative evaluation.

Lynda and Atomic Learning each maintain a vast catalog of instructional content, predominantly in the form of video-based instruction. A review of vendor documentation indicates that both can support the learning needs of higher education through development of technology skills, business skills, digital literacy, college and career readiness, and creative skills. How does one go about evaluating such platforms so as to inform decisions related to purchasing or renewal of licenses? This project presents a range of methods.

Methodology

The comparative evaluation of Atomic Learning and Lynda centered on reviewing platform functionality and ease of use, production value, and instructional quality, in addition to exploring course offerings in areas related to business, software, technology, and creative skills.

Functionality and Ease of Use

User experience provided a lens to explore platform functionality. User experience is defined here as an approach that takes into consideration users and their needs, values, abilities, and limitations. Four questions focused data collection:

  1. Does the site have useful information?
  2. Is the site accessible for all learners?
  3. Is the site easy to use?
  4. Does the site promote flexible methods of access?

Production Value

"Production value" is defined here as instructional content that looks and sounds professional. For review I selected three courses from the top-10 list of courses viewed by participants during the Lynda pilot; then I identified similar courses using the "search" feature in Atomic Learning. In total, I reviewed 18 instructional videos, nine on each platform, at their highest resolution in both the optional full-screen mode and the native in-screen mode. I reviewed the first lesson in each course and randomly selected other videos in the lesson/course for evaluation. Time constraints prohibited analysis of all videos within each of the courses (the three selected Lynda courses contained 128 videos; the three Atomic Learning courses contained 79 videos).

After identifying evaluative criteria, I assigned weightings to provide some measure of consistency during the review. Criteria and assigned weightings included:

  • Instructor presence (2 points)
  • Audio quality (1 point)
  • Video quality (1 point)
  • Script quality (1 point)
  • Unity of design (1 point)

Reviewing the first video in each course determined whether the instructor provided some type of initial greeting and/or corresponding ending, a basic yet important technique given its parallel in the face-to-face classroom, for instance when an instructor welcomes a student to the class. Branding elements such as a company logo were also considered an indicator of professionalism, although not scored. I considered unity of design important because each course encompasses many individual learning resources. The maximum score per course was six points, and the scores for each course were aggregated by platform to produce a final score for comparing production value.
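
To make the arithmetic concrete, the following Python sketch shows how the weighted criteria roll up into course and platform scores. It is a minimal illustration only; the actual review was conducted by hand, and the sample ratings below simply mirror the two SPSS course examples reported in the Results.

```python
# Minimal sketch of the production-value scoring described above.
# Instructor presence is worth 2 points; audio, video, script, and
# unity of design are worth 1 point each (maximum of 6 per course).
WEIGHTS = {
    "instructor_presence": 2,
    "audio": 1,
    "video": 1,
    "script": 1,
    "unity": 1,
}

def course_score(ratings):
    """Sum the points awarded per criterion, capped at each weight."""
    return sum(min(ratings.get(c, 0), w) for c, w in WEIGHTS.items())

def platform_score(courses):
    """Aggregate course scores into a platform total for comparison."""
    return sum(course_score(r) for r in courses)

# Illustrative ratings mirroring the SPSS examples in the Results.
lynda = [{"instructor_presence": 2, "audio": 1, "video": 1, "script": 1, "unity": 1}]
atomic = [{"instructor_presence": 0, "audio": 1, "video": 1, "script": 1, "unity": 1}]

print(platform_score(lynda))   # 6 -- the 6/6 Lynda course score
print(platform_score(atomic))  # 4 -- the 4/6 Atomic Learning course score
```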

Instructional Quality

Investigating instructional quality entailed a content analysis of video transcripts. "Content analysis" is defined here as an analytic approach that uses predetermined categories to check for patterns, frequency, and relationships.² I adapted the screencasting framework developed by William Sugar, Abbie Brown, and Kenneth Luterbach,³ resulting in identification of five focus areas and a series of indicators:

  • Focus 1: Provide overview
    Indicators: Establishes introduction, provides background information and rationale, and makes connections to future lessons
  • Focus 2: Describe procedure
    Indicators: Explains procedures
  • Focus 3: Present concept
    Indicators: Explains concept, provides examples
  • Focus 4: Focus attention
    Indicators: Focuses attention, provides concluding remarks
  • Focus 5: Elaborate on content
    Indicators: Provides enrichment (contextual references and advice)

I revisited the same three courses evaluated as part of the production review process, selecting different videos from each one. In total, I reviewed six transcripts, three for each platform. The combined word count was 1,679 words (10 minutes, 47 seconds of video) for Atomic Learning and 2,350 words (12 minutes, 50 seconds) for Lynda.

Lynda transcripts were copied and pasted into a word processor for review; because Atomic Learning does not provide transcripts, a professional transcriptionist was engaged to transcribe the narration in the Atomic Learning videos. As sole reviewer, I evaluated the transcripts to identify whether each indicator was present at least once.
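
Because the indicator check was a simple presence/absence judgment, it lends itself to partial automation. The Python sketch below shows one possible approach for readers who want to replicate the coding at scale; the cue phrases are hypothetical stand-ins for the human reviewer's judgment, not the study's actual coding rules.

```python
import re

# Focus areas adapted from the Sugar, Brown, and Luterbach framework.
# The cue phrases are illustrative guesses only; in this evaluation a
# human reviewer judged whether each indicator appeared at least once.
INDICATOR_CUES = {
    "provide_overview": [r"in this lesson", r"you will learn"],
    "describe_procedure": [r"\bclick\b", r"\bselect\b", r"go to"],
    "present_concept": [r"for example", r"for instance", r"consists of"],
    "focus_attention": [r"\bnotice\b", r"in this data set"],
    "elaborate_on_content": [r"keep in mind", r"a smart thing"],
}

def indicators_present(transcript: str) -> dict:
    """Flag each focus area as present if any of its cues occurs."""
    text = transcript.lower()
    return {
        focus: any(re.search(cue, text) for cue in cues)
        for focus, cues in INDICATOR_CUES.items()
    }

# Line adapted from the Atomic Learning sample transcript below.
sample = "So, click on Course, and select Insert Variable."
print(indicators_present(sample))  # flags describe_procedure as True
```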

Results

On completion of my evaluations of the courses, I organized the results into the following summary.

Functionality and Ease of Use

1. Does the site have useful information?
Atomic Learning has a catalog of 265 courses (see the Atomic Learning Higher Education Overview Video). The total number of videos is not provided. Lynda has 5,625 courses and 233,056 videos, distributed across the following categories (a quick tally after the list verifies these totals):

  • Developer: 413 courses; 17,128 videos
  • Design: 610 courses; 28,518 videos
  • IT: 237 courses; 8,716 videos
  • Video: 530 courses; 20,712 videos
  • Web: 1,189 courses; 48,512 videos
  • Photography: 555 courses; 25,830 videos
  • Education: 155 courses; 6,396 videos
  • CAD: 186 courses; 7,151 videos
  • Business: 1,284 courses; 51,122 videos
  • 3D + Animation: 282 courses; 12,332 videos
  • Audio + Music: 184 courses; 6,639 videos
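
As a quick consistency check, a few lines of Python confirm that these category figures sum exactly to the reported totals of 5,625 courses and 233,056 videos:

```python
# Tally of the Lynda category figures listed above: (courses, videos).
categories = {
    "Developer": (413, 17_128), "Design": (610, 28_518),
    "IT": (237, 8_716), "Video": (530, 20_712),
    "Web": (1_189, 48_512), "Photography": (555, 25_830),
    "Education": (155, 6_396), "CAD": (186, 7_151),
    "Business": (1_284, 51_122), "3D + Animation": (282, 12_332),
    "Audio + Music": (184, 6_639),
}

total_courses = sum(c for c, _ in categories.values())
total_videos = sum(v for _, v in categories.values())
print(total_courses, total_videos)  # 5625 233056
```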

2. Is the site accessible for all learners?
Review of accessibility features focused specifically on captions and transcripts as shown in table 1.

Table 1. Accessibility features for Lynda and Atomic Learning

| Criteria | Lynda | Atomic Learning |
| --- | --- | --- |
| Presence of closed captions | Yes | Yes |
| Placement of captions | Captions appear below the interface; the screen recording is not obstructed. | Captions appear over the screen recording, resulting in some minor obstruction of the software interface being presented. |
| Turn captions on/off | Yes | Yes |
| Transcripts provided | Yes; transcript text is highlighted with automatic scrolling, with an option to toggle on/off. Transcripts can be copied/pasted into a word processor for additional review or modification. | Transcripts not provided. |

3. Is the site easy to use?
The player and course interfaces on Atomic Learning and Lynda were reviewed to identify their ease of use as shown in tables 2 and 3.

Table 2. Player interface functionality

| Feature | Lynda | Atomic Learning |
| --- | --- | --- |
| Dim background | Yes | No |
| Pop-out video | Yes | No |
| Full-screen toggle | Yes | Yes |
| Playback speed | Yes (seven options) | No |
| Rewind 10 seconds | Yes | No |
| Video resolution | 320p-to-720p HD | 480p-to-720p HQ |
| Note-taking | Yes; the viewer can take notes while listening to a video, and notes are exportable as .docx, .pdf, .txt, Google doc, and Evernote. | No |

Table 3. Course interface functionality

| Feature | Lynda | Atomic Learning |
| --- | --- | --- |
| Course overview | Yes | Yes |
| Course duration | Yes | Yes |
| Course feedback | Yes | Yes, but not on the course interface page |
| Ability level ratings | Yes | No |
| Suggested courses to watch next | Yes | Yes, but not on the course interface page |
| Playlist/Add to favorites | Yes | Yes |
| Share playlist | Yes (public URL, e-mail link, social media) | Yes (generates Easy Links HTML to paste into a web page) |
| Search | Yes; two search options: in-course search and platform-wide search | Yes; one search option: platform-wide search |

4. Does the site promote flexible methods of access?
Four methods of access were explored: offline, mobile, remote authentication, and learning management system (LMS) integration, as shown in table 4.

Table 4. Accessing Lynda and Atomic Learning

| Feature | Lynda | Atomic Learning |
| --- | --- | --- |
| Offline access | Yes; a desktop app is available for OS X and Windows. Courses and/or individual videos can be downloaded for offline viewing. | Not available |
| Mobile access | Yes; courses and individual videos can be downloaded to iOS and Android devices using the Lynda app, and activity (e.g., viewing history) syncs across devices. A mobile website is also available for smartphones. | No; support and development of the iOS app were discontinued because the website is responsive to fit any screen. |
| Authentication | Yes; central authentication service (CAS) was tested during the pilot. Single sign-on (SSO) is also available for web-based authentication, and e-mail verification lets users create a Lynda account using an organization-provided e-mail address with a specific e-mail domain. Other methods (not tested) include integrating Lynda videos into institutional systems; all integration methods are explained in the vendor documentation. | Yes, but not tested. The current method used at Lethbridge College is Internet Protocol (IP) address authentication. Other methods are explained in the vendor documentation. |
| LMS integration | Yes; learning tools interoperability (LTI) integration of course and playlist files | Yes; learning tools interoperability (LTI) integration and Easy Links |
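
Both platforms advertise LTI integration, so it may help to see the general shape of an LTI 1.x "basic launch," the OAuth 1.0-signed form POST that an LMS sends to a content provider. The sketch below is generic and hypothetical: the launch URL, consumer key, and shared secret are placeholders, not either vendor's actual endpoints or credentials.

```python
from urllib.parse import urlencode

from oauthlib.oauth1 import SIGNATURE_TYPE_BODY, Client

# Placeholder values; an institution would receive the real launch URL,
# consumer key, and shared secret from the vendor.
LAUNCH_URL = "https://provider.example.com/lti/launch"
CONSUMER_KEY = "my-college-key"
SHARED_SECRET = "my-college-secret"

# Core parameters of a generic LTI 1.x basic launch request.
params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course-123-link-1",  # unique per LMS placement
    "user_id": "student-42",
    "roles": "Learner",
}

# LTI 1.x requires the form-encoded body to be signed with OAuth 1.0.
client = Client(CONSUMER_KEY, client_secret=SHARED_SECRET,
                signature_type=SIGNATURE_TYPE_BODY)
uri, headers, body = client.sign(
    LAUNCH_URL,
    http_method="POST",
    body=urlencode(params),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
print(body)  # the signed parameters the LMS would POST to the provider
```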

Production Value

From the three courses selected on each platform, I reviewed a total of 18 instructional videos, nine per platform. One reviewed course from each platform follows for illustrative purposes.

Atomic Learning

Course 1 (55 videos): "SPSS 22 Basics Training"

Findings: The first video in the SPSS course, "What you'll learn in this training" (42 seconds), begins with a static view of the SPSS software interface. No audio or branded visuals are used to focus attention, and no personal introduction or course welcome is provided. A course overview is given over a static image of the SPSS user interface. Instructor presence = 0/2 points.

Review of two additional videos: The videos are narrated screen recordings (screencasts) that provide instruction about the SPSS software interface. Audio is clear, and on-screen actions (video) are visible. Native in-screen resolution is crisp; resolution in full-screen mode is slightly less so, with menu items on the SPSS interface appearing somewhat blurred. The script seems concise. The videos share many common elements, including a similar beginning and ending, pacing of delivery, and consistency in use of instructional method. Note: This course does not contain downloadable exercise files. Audio = 1/1 point, video = 1/1 point, script = 1/1 point, unity = 1/1 point.

Total lesson score = 4/6.

Lynda.com

Course 1 (59 videos): "SPSS Statistics Essential Training"

Findings: Review of the "Welcome video" (1 minute 17 seconds) reveals audio and a branded visual to cue learners. Video is used for a personal introduction and a course welcome. The lesson concludes with video of the instructor and an invitation to begin the course. Instructor presence = 2/2 points.

Review of two additional videos: The course uses narrated screen recordings to provide instruction about the SPSS software interface. Audio is clear, and on-screen actions are visible. Native in-screen and full-screen resolution for the screen recordings are crisp. The script seems concise. The three videos share many common elements, including a similar beginning and ending, pacing of delivery, and consistency in use of instructional method. This course is accompanied by downloadable exercise files provided as part of the SPSS software package. Audio = 1/1 point, video = 1/1 point, script = 1/1 point, unity = 1/1 point.

Total lesson score = 6/6.

Instructional Quality

Investigation of instructional quality relied on a content analysis of video transcripts. In total, I reviewed six transcripts, three for each platform. The combined word count was 1,679 words (10 minutes, 47 seconds of video) for Atomic Learning and 2,350 words (12 minutes, 50 seconds) for Lynda. A snapshot of the transcripts, including markup, is provided for illustrative purposes.

Atomic Learning

Course 1: "SPSS 22 Basics Training." Duration: 1 hour, 42 minutes. The series was developed by an independent e-learning developer. Video reviewed: "Defining a Variable Type" (2 minutes, 14 seconds).

Review: There is not much of an introduction (e.g., "In this lesson, you will…"), and a rationale is lacking (e.g., "and this is important because"). The procedure and related sub-procedures seem to be explained well, and the concept (numeric vs. string data) is explained effectively. There is no connection to future lessons (e.g., "in the next lesson we will"), there is little in the way of contextual references, and the concluding remarks are weak.

Sample transcript:

You can enter multiple data types of variables into SPSS data sets, including dates and dollar amounts (Background information). But the most common are numeric and string types of data. Numeric data basically consists of numbers, whereas string can consist of characters as well as numbers (Background information) (Present concept).

In this data set here (Focus), what I want to do is add a new variable to my data set (Explain procedure). What I want to do is add it between "ID" and "Course". So, click on Course (Explicit narration), and I can either right click on the mouse and select "Insert Variable," or go up to "Edit," click on it, and select "Insert Variable" (Explicit narration and options). Either way, once you select Insert Variable, a new Variable column will appear. You will notice that SPSS has given this variable a default value of VAR00008 (Focus attention).

Lynda

Course 1: "SPSS Statistics Essential Training." Duration: 4 hours, 57 minutes. Videos are accompanied by exercise files provided as part of the SPSS software package. Video reviewed: "Recoding with Automatic Recode" (2 minutes, 46 seconds). The series was developed by Lynda.

Review: The introduction makes connections to a previous lesson and provides background information to distinguish the concept of "automatic recode," the topic of the lesson. The procedure is explained and examples are provided. The instructor focuses attention, establishes a rationale, provides advice, and makes simple concluding remarks.

Sample transcript:

You may have noticed earlier when we were recoding variables, that when we went to the transform menu, underneath the recode into same and different variables was an alluring alternative, and that's the automatic recode (Background information). Now, automatic recode works a little differently from some of these others (Focus attention). What automatic recode does is, it's not recoding the cases, it's not ranking them. What it's doing is it's ranking the value labels (Present concept, introduction). So, for instance in this data set here there are over 6,000 cases.

If we were ranking the cases, we could get ranks one through 6,000, 6,401, actually. But if we're ranking the values, you'll only get as many different ranks as there are values (Examples). So, for instance, let me show you this one (Introduction to procedure). Let's take age. So we have age in years right here. We're going to put it over here (Implicit narration). And I'm going to have to give it a new name, because it automatically recodes into a new variable, which is a smart thing to do. I'm just going to call it, "Age Rank". Now, add a new name. (Explicit narration)

Summary of Results

  1. The breadth of offerings in the Lynda course catalog is more comprehensive than that of Atomic Learning.
  2. Captions on both platforms are adequate. Atomic Learning places the captions atop the screen recording, resulting in minor obstruction of the user interface in software demonstrations. No transcripts are provided on the Atomic Learning platform, which presents barriers for learners who require accommodations or viewers who prefer to read rather than listen.
  3. The player interface on Lynda has more functionality, making it possible for the user to customize the viewing experience to a greater extent than in Atomic Learning. The in-course search feature on Lynda, which makes it possible for a user to search within a course's videos, and integrated note-taking are not present in Atomic Learning.
  4. Lynda provides more flexible methods of access, including offline viewing.
  5. The total combined production value score for Lynda (18/18) is higher than that for Atomic Learning (11.5/18). Demonstrating instructor presence (e.g., personal introduction and course welcome) and consistency with respect to audio, video, script quality, and unity across lessons contributed to the higher score for Lynda.
  6. Each of the lessons reviewed on the Lynda platform was accompanied by downloadable exercise files that the learner could use while watching the videos. No such files were present for the Atomic Learning lessons. There was variability across lessons on both platforms in terms of key elements such as providing a lesson rationale or concluding remarks. One Atomic Learning lesson contained musical accompaniment whose purpose, from an instructional design viewpoint, is unclear. Another lesson provided a link to a resource on a third-party website; the learner could not click the link but had to write it down and then type it into a web browser. For a course on time management, this seemed not only counterintuitive but also reduced time spent on task.

Discussion

The Lynda.com pilot brought into focus the diverse learning needs of students, instructors, and nonacademic staff and established the need for and benefit of access to a flexible, on-demand learning solution. The comparative evaluation revealed that Lynda has some essential features that set it apart from Atomic Learning.

First, the Lynda course catalog is substantially larger. This is a strength; simply put, the larger the catalog, the more likely you are to meet audience needs, not to mention other advantages, for instance when an online catalog can serve up additional resources based on a viewer's previous history.

Second, Lynda's catalog contains instructional resources of superior production value to those of Atomic Learning. Given the success and reach of open, online video platforms such as YouTube, most people can now intuitively distinguish between videos of high or low production value. The YouTube phenomenon makes a trade-off visible: relying on many different authors creates opportunities to leverage scale, yet it poses a challenge to maintaining quality. The high marks that Lynda received for production value result from bringing subject matter experts onsite to its production facilities to author instructional resources. Atomic Learning, in contrast, sources its instructional content in a more flexible manner, letting its instructors develop their own instructional resources or link to third-party sites. The result is varying levels of production value.

Reviewing videos to ascertain instructional quality proved somewhat inconclusive, as the presence of key indicators varied across lessons. However, investigation into functionality and ease of use revealed that the Lynda platform sets itself apart with other indicators related to instructional quality:

  • First, it provides both transcripts and offline access, making it possible for learners to choose when, where, and how they learn.
  • Second, Lynda makes available downloadable exercise files, enabling students to shift from simply watching to actually doing during the lesson.
  • Lastly, functionality on the course and/or player interface, such as "in-course" search and integrated note-taking, provides the learner with more options to customize their learning.

Project Limitations

The pilot was three months in duration and had a limited project scope. For instance, functionality and ease of use for the desktop and mobile versions of Lynda and for the browser-based experience of Atomic Learning on mobile devices were not reviewed, and the accessibility review was cursory. One focus area identified, though not investigated in detail, was administrative reporting capabilities, for instance the ability to generate custom reports. Lastly, both the production review and the instructional quality review entailed evaluating a small subset of videos within each course, and they reflect the perspective of only one reviewer. The instructional quality review was limited to content analysis of transcripts and could be strengthened by a review of both the transcripts and the multimedia elements in each lesson. Worth noting is that both platforms continue to evolve through the addition of new features.

Key Findings

  1. The Lynda course catalog is substantially larger than the Atomic Learning course catalog. The larger the catalog, the more likely a service provider can meet audience needs.
  2. Lynda's catalog contains instructional resources of superior production value.
  3. Reviewing lessons for instructional quality was somewhat inconclusive, as the presence of key indicators varied across all lessons.
  4. The Lynda platform sets itself apart with other indicators of instructional quality.
    1. First, it provides both transcripts and offline access, making it possible for learners to choose when, where, and how they learn.
    2. Second, Lynda makes available downloadable exercise files, thereby enabling students to shift from simply watching to actually "doing" as they watch the lesson.
    3. Lastly, functionality on the course and/or player interface, such as "in-course" search and integrated note-taking, provides the learner with more options to customize their learning.

Conclusion

The Lynda.com and Atomic Learning platforms were reviewed to identify relative strengths and weaknesses. Results from the comparative evaluation indicate that the Lynda platform is better positioned to meet the needs of Lethbridge College, though its extra features bring with them a higher cost. Lynda has a larger and more comprehensive course catalog, and its instructional resources demonstrated higher production value, scoring high marks for audio, video, script quality, and unity of design across lessons. Finally, when looking at the platforms through a learning-centered lens, the Lynda platform combines resources and platform functionality, including transcripts, downloadable exercise files, and in-course search, enabling students to become active participants (not passive viewers of video) in the learning process.

Notes

  1. Caroline Sinkinson, Mark Werner, and Diane Sieber, "RAIT: A Balanced Approach to Evaluating Educational Technologies," EDUCAUSE Review, June 16, 2014.
  2. Pranee Liamputtong, Qualitative Research Methods, 4th edition (South Melbourne: Oxford University Press, 2013).
  3. William Sugar, Abbie Brown, and Kenneth Luterbach, "Examining the Anatomy of a Screencast: Uncovering Common Elements and Instructional Strategies," International Review of Research in Open and Distance Learning, Vol. 11, No. 3 (October 2010).

Andy Benoit is manager of Instructional Technology at Lethbridge College.

© 2016 Andy Benoit. This EDUCAUSE Review article is licensed under Creative Commons BY-NC 4.0 International.