Migrating from Drupal 7 to Known

10 min read

What's Next?

As you can see, funnymonkey.com has had quite a facelift. When it became clear that FunnyMonkey would be going through a transition, Bill and I reviewed what the future of funnymonkey.com should look like. Historically, the reason to keep coming back has been Bill's blogging on education and education policy, so the focus would be on something that worked well as a blogging platform. The net was cast wide and we considered many options, including staying with Drupal or migrating to WordPress, Laravel, Revel, Go, and so on.

In the end we chose Known. Known was already on my radar after I met Ben Werdmüller and Erin Jo Richey at Reclaim Your Domain: The UMW Hackathon. Besides being great people to talk with and work with, Erin and Ben have a great vision for Known and a solid architecture. Known is built with the ethos of the IndieWeb movement and the POSSE publishing model, and the ethos of Known and of FunnyMonkey line up pretty closely.

How do we get our content into Known?

Okay, we've chosen Known, and we have ten years of content currently sitting in a Drupal 7 site. Now what?

After a cursory review, the import and export routines within Known appeared to be hardcoded and, as far as I could tell, not pluggable. That's a minor disappointment (more on this later). At this point it looked like a custom plugin was the way forward. Known plugins are pretty straightforward, and looking at the default ones proved to be quite helpful. For instance, take a look at Bridgy's Main.php file (found under IdnoPlugins):


    namespace IdnoPlugins\Bridgy {
        use Idno\Common\Plugin;
        class Main extends Plugin {
            function registerPages() {
                \Idno\Core\site()->template()->extendTemplate('account/menu/items', 'bridgy/menu');
                \Idno\Core\site()->addPageHandler('account/bridgy/?','IdnoPlugins\Bridgy\Pages\Account');
            }
        }
    }

That's it for a minimal plugin: just register some pages and templates. Past that, there is an expected directory structure where Known will find the registered page handlers and templates. Again, reviewing Bridgy:


Bridgy/
├── Main.php
├── Pages
│  └── Account.php
├── plugin.ini
└── templates
    └── default
        └── bridgy
            ├── account.tpl.php
            ├── facebook.tpl.php
            ├── menu.tpl.php
            └── twitter.tpl.php

We see that the call to \Idno\Core\site()->addPageHandler() registers a page handler for account/bridgy, handled by the class IdnoPlugins\Bridgy\Pages\Account (in Pages/Account.php). That's the basic structure. I'm covering Bridgy for a couple of reasons:

  1. It's simple: it doesn't take much code to constitute a plugin in Known.
  2. It's included: the code I'm about to show you is my first Known code and is largely one-off, since it is a migration and will not have ongoing use. Using Bridgy as an example is a bit more illuminating, as it's fair to say it is likely idiomatic Known code.

Writing a content migration plugin

Caveat: this is not exemplary code and can be improved in many ways. What it does show you is how easy it is to get content from other systems into Known. There are many points worth considering for refactoring, such as storing the new-ID-to-old-ID association as the content is imported rather than outside of the save routine(s). That said, you can find the code we used over here.

I'm going to skip over the detailed points of the code in the hope that it is commented well enough and easy enough to read. This will instead focus on an overview of the process.

Assumptions

  1. The Drupal database will be available during the import routines. For this we just backed up the funnymonkey.com database and restored it locally on our development stack.
  2. The Drupal files directory will be available during the import routines. These were just rsync'd from the production site into /srv/www/legacy/files.
  3. The migration will proceed in the following order, as dictated by dependencies:
    1. Files: have no dependencies
    2. Users: have profile pictures and therefore require Files
    3. Nodes: have authors and files associated and thus require the File and User imports
    4. Comments: require Nodes
  4. The source content is in MySQL.
  5. URL rewrites will be created to map all content.
  6. Some method to check old content against new content will be necessary for quality checking.

Writing our plugin

Registering pages


function registerPages() {
    // Administration page
    \Idno\Core\site()->addPageHandler('admin/drupalmigration','\IdnoPlugins\DrupalMigration\Pages\Admin');
    \Idno\Core\site()->addPageHandler('admin/drupalmigration/users','\IdnoPlugins\DrupalMigration\Pages\User');
    \Idno\Core\site()->addPageHandler('admin/drupalmigration/nodes','\IdnoPlugins\DrupalMigration\Pages\Node');
    \Idno\Core\site()->addPageHandler('admin/drupalmigration/files','\IdnoPlugins\DrupalMigration\Pages\File');
    \Idno\Core\site()->addPageHandler('admin/drupalmigration/comments','\IdnoPlugins\DrupalMigration\Pages\Comment');
    \Idno\Core\site()->template()->extendTemplate('admin/menu/items','admin/drupalmigration/menu');
}

In order, we register pages for the following:

  1. Admin page: This will be our overview where we set our database settings. Arguably this could be omitted and the settings hardcoded.
  2. User Page: This will be the overview for user import.
  3. Node Page: This will be the overview for node import.
  4. File Page: This will be the overview for file import.
  5. Comment Page: This will be the overview for the comment import.

Then we register a template extension to get our DrupalMigration pages into the menu. This is just a snippet that extends the existing admin menu to include our DrupalMigration options. Review the contents of

DrupalMigration/templates/default/admin/drupalmigration/menu.tpl.php

to see how this injects our menu options into the default menu.
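
As a rough sketch of what such a template extension can look like (the markup and link text here are assumptions, not the plugin's actual template), menu.tpl.php just needs to output an extra menu item that Known appends to the region we extended:

    <?php
    // Hypothetical sketch of templates/default/admin/drupalmigration/menu.tpl.php.
    // Known appends whatever this template outputs to the 'admin/menu/items'
    // region that we extended in registerPages().
    ?>
    <li>
        <a href="<?php echo \Idno\Core\site()->config()->getDisplayURL() . 'admin/drupalmigration'; ?>">
            Drupal Migration
        </a>
    </li>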

Implementing a page

I'm only going to cover the process for the File portion, as that is our first page and is exemplary of the process for all the other pages (excluding the overview page, where the database settings are entered). The framework for this file is the following:


namespace IdnoPlugins\DrupalMigration\Pages {
    class File extends \Idno\Common\Page {
        function getContent() {
        }

        function postContent() {
        }
    }
}

We extend \Idno\Common\Page and implement two methods, one for a GET request and one for a POST request. In the file's getContent() method we ensure that only admins can access the page via $this->adminGatekeeper(); then we proceed to build out some tabular data to give an overview of the files to be imported and their status. We store ongoing migration data inside of Known's site config. Arguably we should have used an external table to manage this; that would be especially necessary for larger migrations. The file map, which tracks files we have already imported, is stored in \Idno\Core\site()->config()->drupal_migration_file_map. Most of this code consists of building up a data structure which we then pass to our admin/file template.
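
As a hedged sketch of that flow (helper names like getLegacyFiles() are hypothetical stand-ins, and the template-drawing calls should be checked against the plugin source), getContent() boils down to something like this:

    function getContent() {
        // Only admins may view or run the migration.
        $this->adminGatekeeper();

        // The map of already-imported files lives in Known's site config.
        $filemap = \Idno\Core\site()->config()->drupal_migration_file_map;
        if (empty($filemap)) {
            $filemap = array();
        }

        // Hypothetical helper: in the real plugin this data comes from a
        // query against the Drupal files tables (see getFiles()).
        $files = $this->getLegacyFiles();

        // Hand the structure to the admin/file template for display.
        // (Exact template API usage may differ; see the plugin for details.)
        \Idno\Core\site()->template()
            ->__(array('files' => $files, 'filemap' => $filemap))
            ->draw('admin/file');
    }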

You can review the template in DrupalMigration/templates/default/admin/file.tpl.php. Again, this could be better architected to do more of the logic inside the getContent() method so that the template is just iterating and outputting rather than doing any calculations. That said, our template does do a bit of work to present some URL rewrites for the files that have been migrated so that we can include them in our .htaccess after the migration.

For the postContent() method we again ensure the user is an admin and then iterate over the files, using our plugin class's methods to handle all of the heavy lifting of getting the files into Known. After we process all of the files we redirect via $this->forward(\Idno\Core\site()->config()->getDisplayURL() . 'admin/drupalmigration/files'); back to the same page so the user can see the results.
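
A minimal sketch of that loop, assuming hypothetical getLegacyFiles() and importFile() helpers (the real plugin's method names differ) and assuming the site config persists via save(), looks roughly like this:

    function postContent() {
        // Again, admins only.
        $this->adminGatekeeper();

        // Pick up where any previous run left off.
        $filemap = \Idno\Core\site()->config()->drupal_migration_file_map;
        if (empty($filemap)) {
            $filemap = array();
        }

        // Hypothetical helpers: getLegacyFiles() queries the Drupal database,
        // importFile() copies one file into Known and returns its new ID.
        foreach ($this->getLegacyFiles() as $fid => $file) {
            if (!isset($filemap[$fid])) {
                $filemap[$fid] = $this->importFile($file);
            }
        }

        // Persist the updated old-ID-to-new-ID map back to the site config,
        // then send the user back to the overview to see the results.
        \Idno\Core\site()->config()->drupal_migration_file_map = $filemap;
        \Idno\Core\site()->config()->save();
        $this->forward(\Idno\Core\site()->config()->getDisplayURL() . 'admin/drupalmigration/files');
    }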

Additional details

Hopefully everything so far has been helpful. The code could be used as a starting point for other Drupal site migrations into Known. The constants at the top of the file will need adjustment to appropriately grab your content. Assuming you use the same field names for the SQL queries, the rest of the import code should largely work. Outside of those constants at the top, the following methods will likely need review and refactoring to meet your needs:

  • getFiles(): This currently includes a bunch of unmanaged files and dummies them up to match the managed files data structure. The list of unmanaged files that should migrate will vary from site to site.
  • addUser(): Hardcodes adding a couple of users as admins; this could be omitted. All user accounts get mangled passwords between 68 and 127 characters in length. The idea here is to require users to set a new password via Known's password reset process.
  • rewriteURL(): Can be modified to clean up any garbage content and normalize URLs into one particular format (a minimal, hypothetical sketch of this normalization follows this list). We opted to switch to relative rather than absolute links so that testing would work when we were not on the funnymonkey.com domain. This could also be extended to rewrite node references to other nodes as well, but we opted to defer to 301 (moved permanently) redirects.
  • rewriteContentLinks(): We rewrite content references using our rewriteURL() process so that we can map files to their new destination and normalize on the same process for all content.
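
Here is the kind of normalization rewriteURL() performs, as a minimal, hypothetical sketch (the function name and regex below are illustrative only; the real method also handles file path mapping and cleanup):

    // Make absolute links to the production site relative, so migrated
    // content works on any development or staging domain.
    function normalizeLegacyURL($url) {
        return preg_replace('#^https?://(www\.)?funnymonkey\.com#i', '', $url);
    }

    // Example: 'http://funnymonkey.com/node/123' becomes '/node/123'.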

Taxonomy is handled by mapping to hashtags appended to the end of the content. See addNode() for more details.
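
As a rough illustration of that mapping (the helper below is hypothetical; addNode() builds the term list from Drupal's taxonomy tables), turning a node's terms into hashtags is essentially:

    // Hypothetical sketch: append a node's taxonomy terms to its body as hashtags.
    function termsToHashtags($body, array $terms) {
        $tags = array();
        foreach ($terms as $term) {
            // 'Education Policy' becomes '#educationpolicy'.
            $tags[] = '#' . preg_replace('/[^a-z0-9]/', '', strtolower($term));
        }
        if (!empty($tags)) {
            $body .= "\n\n" . implode(' ', $tags);
        }
        return $body;
    }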

URL rewrites

In addition to each step in the migration rendering a list of rewrites at the bottom of the import screen, Drupal also uses URL aliases (the url_alias table) that we need to account for.

The following SQL generates those rules for us; we omit all URL aliases that do not point to users or nodes.


SELECT CONCAT('RewriteRule "^', alias, '$" "', source, '" [L,R=301]') FROM url_alias WHERE source LIKE '%user%' OR source LIKE '%node%';

Points for improvement in Known

Overall, the experience with Known was fantastic; it was very refreshing to work with a system with such a tightly focused use case and quality implementation. That said, the following details were points that I saw as potential opportunities for improvement.

Modular import/export process
Arguably this can be handled with custom code, as we did. However, having a modular import/export process lowers a barrier to collaborating and getting content into Known. Perhaps the import/export functionality should itself be a Known plugin. In fairness, what is currently there handles other platforms that have a standardized export process, and that's a good first step. Besides, Drupal is far from being in a place to have a standard export routine across various implementations. For Drupal there could be a standard Views export template: you map your content into that export, and then a generic Drupal-to-Known importer imports data in the particular format defined by the Views template. But that's a Drupal project.
addAnnotation() doesn't return an ID
The other processes and methods for saving Known content all return the newly created ID when creating new objects. This is really a minor nitpick, but it made checking the import routine a bit haphazard and prevented a one-to-one mapping for the URL rewrites. In our case we opted to rewrite to the source document rather than the specific comment. While this loses the direct link, it does not break the link in the event anybody had linked to the site externally.

Upgrading to Drupal 7.20 and Fixing Broken Image Paths

2 min read

On a project that is still in development, we recently did a core upgrade as part of our pre-launch preparations. The project involved a data migration of tens of thousands of nodes and users; part of the migration involved manual cleanup of image data to account for a responsive design in the new site and different use of screen real estate between the old site and the new site.

However, as is noted on the 7.20 Release Notes, there are some issues that can arise on sites using the Insert module after upgrading to Drupal 7.20. The diff on the release notes page for the 7.20 release gives a sense of how the ramifications of these issues are still evolving; given that there are a little over 42,000 sites reporting use of Insert in D7, there will likely be other people affected by issues similar to what we experienced.

What We Experienced

After we upgraded, none of the images that had been inserted into text areas via the Insert module were displaying. The images were still present in the file system, but given that they were derivatives created via imagecache, it would have taken only a single flush of imagecache, or a modification to an image style, and poof! the stored version would be gone, making the pre-existing path to the image even more useless.

As is noted in this issue, older versions of the Insert module don't work cleanly with Drupal 7.20. As several commenters posted in that issue, all relative links no longer worked. This issue led to the creation of a sandbox project that begins to address the issue. However, cleaning up pre-existing content is heavily dependent on the site in which that content was created or edited.

Next Steps

After researching the issue, Jeff developed an update script that we tested in dev prior to rolling live on staging.

As mentioned above, the specifics of updating content will vary based on your precise site configuration, so the chances of this script working cleanly on your site with no modification are virtually nil.

However, in the interest of saving the next people to face this issue some time, the script is on GitHub; please fork away and have at it.


Vagrant and Puppet for Drupal development

6 min read

History

We here at FunnyMonkey have been using virtualized development environments for several years. Fortunately for us this has all been with VirtualBox, so making the transition to Vagrant made perfect sense. Prior to this transition, we had been using a set of shell scripts to configure all of our environments identically. While this process was certainly adequate and a major improvement upon XAMPP/MAMP/etc., it still left a bit to be desired. Vagrant combined with Puppet addresses the shortcomings of our previous approach.

What is Vagrant?

Vagrant bills itself as "Virtualized development made easy," and this is a fairly accurate self-assessment. In our experience, Vagrant is an improvement upon a plain VirtualBox development environment. So far, in our limited use, Vagrant solves several problems that previously required extensive documentation or manual workarounds for each deployment. With Vagrant we solve several issues:

  1. What is our starting point? Vagrant allows pre-defined boxes to be your configurable starting point. These can have all sorts of options and configuration pre-baked.
  2. How do we handle host-specific issues? Specifically, how do we handle the myriad OS and hardware issues?
    • How many CPUs or cores should be associated with this virtual environment?
    • How much RAM should this virtual machine use?
    • How do I get an active routable internet address on this host?
    • Are there any other specific workarounds that need to be accounted for?
    These issues, as well as quite a few others, are addressed via the Vagrantfile; with a well-defined hierarchy it is easy to provide an inheritance structure, with local defaults trumping any values that may have been set in the originating box definition, and project-specific settings trumping all other values.
  3. How do we ensure the box is configured as our project needs it? Vagrant kicks off Chef, Puppet, or even custom shell scripts to handle project-specific configuration. The rest of this documentation assumes the use of Puppet.

What is Puppet?

Puppet is a configuration management tool, which means that Puppet ensures the system it runs on is configured exactly as the Puppet configuration defines. In large scale production or cloud computing environments this is absolutely critical, as it ensures that each machine is configured identically. In development environments this is valuable for several reasons:

  1. In deployments requiring special configuration, the development environment can be configured precisely the same as the production environment.
  2. Development environments can be unique per client project. Rather than overloading a XAMPP/MAMP installation with a sub-directory or virtual host per project, each development environment can be standalone.
  3. The time required to deploy a development environment is a small fixed cost and is quite predictable.
  4. Development environments can have exactly the packages and configuration they need. Does your current project require the latest PHP, or perhaps Compass or Apache Solr? Using a virtualized environment allows you to pick the best-in-class OS and package management for your particular project, and allows that configuration to be used by other members of your team just by using the same development platform.

Why should I care?

Reproducibility.

Using defined development environments that mimic or are identical to production allows every team member to have meaningful conversations about real issues rather than being mired in OS-specific or configuration-specific details. That is, it lets development team members focus on the real problems rather than:

  • How much memory is PHP configured to use?
  • What version of X are you using?

Most importantly, it lets us assume that everyone is starting at point A. This means that, in terms of a specific project, each team member has the same directions to get from point A to point B. We no longer need a completely unscalable set of instructions on how each team member first gets to point A. Instead we can focus on the needs of our project.

How do I get started?

You can follow the Vagrant getting started instructions here. We have put the scripts we have developed on GitHub; feel free to grab them and fork them. If you follow those steps through "Install Vagrant" you can then clone our git repository from GitHub, which contains some bash scripts and a Vagrant-with-Puppet configuration designed for Drupal development.

  1. Install Vagrant.
  2. git clone git@github.com:FunnyMonkey/fm-vagrant.git
  3. cd fm-vagrant
  4. Run ./build.sh; this creates the Vagrantfile and nodes.pp.
  5. vagrant up
  6. Start using your virtual environment.

There is additional documentation in the README on GitHub.

Questions

Why not just pre-define a box for Drupal development and remove Puppet?

While you could certainly do this, we feel that the best approach is to configure on the fly. This provides several benefits:

  1. You can have a consistent basic configuration that every project uses. This affords a certain amount of familiarity in ensuring that each project has a similar starting point.
  2. Each project can have exactly the components it needs rather than every component that every project could need.
  3. A pre-baked box or configuration is decent in a static environment. Over time, security exploits will be discovered and corresponding fixes will be implemented. Occasionally these security fixes will create incompatibilities with a project's configured features. As such, a project needs to grow and evolve in a similar manner to how its parent OS and configuration grow and adjust.

Why not grab Drupal as part of the install?

While you can certainly do this, Puppet will reset any changes that you make if you do not make them via the Puppet configuration.

What does this workflow look like?

Once you have the basics set up, you just clone the vagrant repository for each project and then 'vagrant up' a new environment.

I prefer to connect to the VirtualBox machine over SSH, as that way everything is self-contained in the virtual machine. You could also set up shared folders that map to your host OS if that workflow is better for you, or if your OS and editor combination does not support editing files over SSH. From here your regular workflow should take effect.

As always if you notice any errors, inaccuracies, or ways in which this approach could be improved please chime in.


Building Drupal Style Tiles using Foundation and SCSS

5 min read

This past weekend, Pacific Northwest Drupal Summit in Seattle, WA, was buzzing with talk of Foundation, an amazing responsive web framework built with HTML5 and SCSS.

I led a session on Building Drupal Style Tiles using Foundation and SCSS and provided sample code for attendees to download and install to create their own style tiles.

Original video is available on YouTube.

What are Style Tiles?

Style Tiles are not a new concept in design. Graphic and interior designers often use mood boards or similar deliverables to develop aesthetics before beginning design. Everyone can now style-tile-ize whatever they want: just sign up for Pinterest and start pinning, and you too can create a palette for your new living room or a palette of cute baby animals. (Why I really love Pinterest)

Style Tiles for the web were made popular by Samantha Warren of Twitter. She has developed the site styletil.es which discusses the value that style tiles have in the design process and provides a Photoshop template download so you can create your own tiles. She has written a great article on A List Apart discussing the style tile process.

I have used Photoshop style tiles in multiple projects and have found them to be an invaluable step in the design process. By creating style tiles, a designer is able to flesh out multiple palettes, typography choices and design patterns without spending the time it takes to design an entire homepage. I especially love creating the different patterns and only giving the user a "sneak peek" of these patterns. Once the client chooses the option with which they want to move forward, the style tile process makes it easy to duplicate the Photoshop layer patterns and begin the homepage design.

One thing I have removed from my style tiles is the descriptive adjectives. I have found that they are distracting to the clients and that often, the clients focus more on the words than the design.

Creating Style Tiles Using HTML5 and SCSS

This summer while sitting on an airplane and designing a style tile in Photoshop, I thought to myself:

"Wouldn't it be cool if I had a style tile built in HTML that I can just plug colors and other stylistic elements into SCSS variables and create an interactive style tile?"

And…

"Wouldn't it be cool if I could build style tiles that use Drupal HTML code that I can style and then reuse code and mixins in my site's theme? "

And...

"Wouldn't it be cool if those style tiles were responsive so that the client could get a feel for how elements change in different devices?"

At the same time, I had just discovered Zurb's Foundation web framework and was totally swooning over it. I still am.

I immediately closed Photoshop, downloaded Foundation and started coding.

What is Foundation?

So what is Foundation? Foundation is, and I quote Zurb on this (and totally agree), "The most advanced responsive front-end framework in the world."

Foundation is written in HTML5 and SCSS to provide a responsive open-source code set that developers can download and use for prototyping or site building.

Foundation comes with a huge list of features. Some features are:

  • 12-column typographic grid designed to work on almost any sized device screen
  • Typography based on a golden ratio modular scale
  • Multiple button styles, patterns and sizes plus hover and active states
  • Custom form styles with validation states and checkbox/radio styles
  • Multiple navigation styles with drop down support and configurable for responsiveness
  • Responsive image or content slider
  • Modal Dialog

Whoa. So great!

Building the Style Tiles in Foundation

Download Style Tile Code

I use Foundation to create the two-column responsive layout. I have created two different directories: one is called base, the other is called example. Each style tile shows and defines the style for the following elements: typography, swatches, link styles, button styles, tabbed navigation, tags, pager and status/warning/error messages. I have included app.scss and settings.scss and modified specific settings in settings.scss to define colors and fonts.

I created tile.scss to provide the layout and the swatch mixin for the style tile. I have also included an empty overrides.scss for any new styles.

Base Style Tile

The base style tile shows Foundation in action using its default colors and styles.

In the example style tile, I have removed the Foundation classes from the HTML elements and added styles to overrides.scss. These styles and mixins are intended to be code that a front-end developer can use in their Drupal theme.

How can these style tiles be used?

During the presentation at the Pacific Northwest Drupal Summit, the group and I discussed the different potential uses of this code. Most straightforwardly, these style tiles can replace Photoshop-based style tiles. While the designer may still do a bit of work in Photoshop, the presentation of the style tile will be in HTML.

We thought that this could be a tool to help facilitate communication between designers and front-end developers to help clarify how the design is carried out into interaction. This tool could also serve as a client deliverable to show how fonts will render, allow the designer to showcase different webfonts and demonstrate how specific elements will change with user interaction.

Potentially, as a Drupal-specific project, we can hook into the Styleguide module through hook_styleguide() and hook_styleguide_alter() to modify the style guide output so that it contains style tile elements such as color swatches.
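
As a hedged sketch of that idea (the element keys below are assumptions based on the Styleguide module's item structure and should be checked against styleguide.api.php; the function name and markup are hypothetical), a theme or module could expose swatches like this:

    /**
     * Implements hook_styleguide().
     *
     * Hypothetical sketch: expose style tile color swatches as style guide items.
     */
    function mytheme_styleguide() {
      $items['style_tile_swatches'] = array(
        'title' => t('Color swatches'),
        'description' => t('Primary palette pulled from the style tile.'),
        'content' => '<div class="swatch" style="background:#336699"></div>'
                   . '<div class="swatch" style="background:#99cc33"></div>',
        'group' => t('Style tile'),
      );
      return $items;
    }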

I would love to hear any other ideas people have. Please download, fork, and add bugs or questions to the GitHub queue if you have any issues.


Using Drupal In Education Unconference in Portland - Save the Date

3 min read

At DrupalCon Denver, we had an extremely successful Education Unconference and we're planning to do it again in Portland. The planning for this event is in the very early stages; we have nailed down a date - Monday, May 20, 2013 - and are in the process of securing a venue. The event will take place in Portland, Oregon, and as soon as we finalize the location, we will update the announcement page with details. One thing to note at the outset, though, is that we want to make this event an opportunity for people familiar with Drupal and people familiar with education to come together and discuss common issues. If you work in either one of these areas, please come - we all have a lot to learn from one another, and with one another.

Based on discussions in Denver (and within the Drupal community over the years), several general areas of interest continue to emerge. Some likely topics could include:

  • Large scale deployments, and how to balance the needs of individuals/units within an organization against maintaining a reasonably standardized platform;
  • Responsive web design;
  • Strategies for mobile web applications;
  • Ensuring accessibility within web sites;
  • Supporting communities of practice;
  • Drupal as a traditional LMS;
  • Using Drupal to support informal and inquiry-driven learning.

Admission is free of charge; just register here! The event will follow an unconference format, so if there is something you want to talk about, propose a topic, find some like-minded individuals, and let the conversation start.

As with the Denver unconference, we have similar high-level goals:

  • Facilitate connections between people working in the education space who would not have the opportunity to interact within the larger venue of DrupalCon;
  • Generate conversations among people working in different areas of education; in this way, K12 folks could talk to Higher Ed, people working in Libraries could talk to other stakeholders, etc - while there are many differences in what we do, there are also similarities, and it would be good to see some opportunities for collaboration materialize;
  • Set the stage for more focused BoFs at DrupalCon - rather than spend the first BoF of DrupalCon figuring out who wants to talk about what, we could lay the groundwork at the unconference for ongoing conversations throughout DrupalCon;
  • Discuss development methodologies and best practices that are making our lives easier, and more productive;
  • Demonstrate and discuss example sites, and talk about how we built great sites that help people learn more effectively;
  • Your idea here: https://2013.drupalpdx.org/forum/education-unconference-planning

So, mark May 20, 2013, on your calendars. The second Using Drupal in Education Unconference is on! We look forward to seeing you there.


Drupal Presentation Notes, and the Role of Open Source in Mainstream Ed Tech

3 min read

For the last few days, I was down in San Diego, California for the 2012 ISTE conference. I was down there running a session for people to learn about Drupal. I have added the notes I used for my presentation into the handbook in the Tutorials section.

During the conference, I wandered onto the vendor floor to touch base with some friends who were working in different booths. Once I recovered from the shock of a bright orange gimp-man shilling hardware, I was struck by the overwhelming lack of any open source representation. Aside from a single Moodle vendor, there was none to be found.

[Image: Orange Man]

This paucity is all the more striking because of the amazing, innovative work I see happening within education using open source tools. On a very regular basis, I see schools using a range of open source tools to support curriculum mapping, online classes, collaborative projects, community outreach, professional development, portfolios - and in these cases, schools aren't paying exorbitant fees to vendors, or losing control of the work performed by teachers and students, or ceding flexibility for convenience. They are just working - intelligently, intentionally, making mindful progress towards articulated goals, and using open source tools to support and extend that work.

But this narrative seemed largely missing at ISTE - possibly, this is due to the company I keep, as I tend to gravitate more toward people who are doing the day to day work in the classroom. But from visiting the vendor floor, the story of educational technology - at least this year - seemed to be one of convenience and speed over vision and ongoing effort. Technology, at least the vision of it being articulated and sold on the vendor floor - is the panacea. It is the thing that makes the difficult easy, and makes all of us smarter.

Open source has a role to play in articulating a different narrative about education - a narrative that values individual effort within a community that is loosely united toward a common goal. The development model within open source communities (and this model has been in place and thriving well before the days of Web 2.0-ifying everything) has always supported (in general terms) peer to peer learning and support. The absence of open source companies in the larger mainstream educational technology world is a loss for both open source and educational technology.

For an additional perspective on the state of data control and access to data, Audrey Watters has a piece over at Hack Education on how vendors responded to questions about data portability and APIs. She was aided in the quest by Kin Lane, who knows a thing or three about APIs.


Drupal in Education Unconference 2012

3 min read

On Monday, March 19th, at Del Pueblo School in Denver, Colorado, around 75 Drupalists and educators met for a day of sessions focused on the needs of people working in education, and how Drupal can help.

Sessions ran all day, and some of the topics included how to manage hundreds of sites within an organization, responsive design best practices, how to use distributions within education, and how to ensure that sites are accessible.

One of our goals in planning this event was to carve out the space and time for people working within education to have substantive conversations with other practitioners, and to increase the communication between people working in different areas of educational systems. At a technical level, there are overlaps between some of the core issues people are addressing, regardless of whether they work in K12, Higher Ed, Government, in the classroom, or as part of administrative support. Philosophically, if we look at education as a process that unfolds continuously across people's lives, people within different levels of education can benefit from knowing more about how their counterparts are solving problems, and the rationales behind the systems they put in place. One of the things that struck me yesterday, as I talked with different people at the event, was the skill, talent, and focus of the people who came. I feel fortunate to have had the opportunity to sit in a room full of smart, talented people and listen to how they are doing their work.

In the conversations, there was also a common thread of education as a social justice issue. Ideas ranging from the notion that a user interface should be evaluated from the perspective of how it empowers people, to the need for multilingual translations (and how best to achieve them), were a couple that I'll be thinking about for a while.

I'd like to thank the participants who came and made the day happen. And, if you were at the event and want to keep in touch with the other participants, please add your name to the wiki.

As mentioned earlier, the event was held at Del Pueblo School, and the space was generously made available to us by the Denver Public School System. Michael Wacker and Glenn Moses were instrumental in making this connection.

Also, get ready to mark your calendars. We will be organizing this event next year; we will announce the dates for the next event shortly after the dates for DrupalCon 2013 are announced.


Solving Problems and Finding Solutions in Education: A Panel Discussion at the Drupal in Education Unconference

2 min read

As part of our preparations for the upcoming Education Unconference taking place on March 19th in Denver, we are happy to give an update on the panel discussion.

Register for Drupal in Education Unconference in Denver, CO on Eventbrite

The participants will include:

  • Jason Hoekstra - Jason is the Technology Solutions Advisor at the US Department of Education. As part of his work in the Department, Jason is working on the Learning Registry, a system to support improved sharing and collaboration among people creating and using online content for learning.
  • Bud Hunt - Bud is an Instructional Technology Coordinator for the St. Vrain Valley School District in northern Colorado. Prior to becoming an Instructional Technologist, Bud taught English. He has been blogging about technology, writing, learning, and learning online since before there was an internet.
  • Bryan Ollendyke - Bryan works in the e-Learning Institute at Penn State as an Instructional Web Technologist. Bryan has been a leading advocate for Drupal within higher education, and is the main developer of ELMS, a Drupal-based learning and instructional design platform.
  • Glenn Moses - Glenn is the Director of Blended Learning at Denver Public Schools. Glenn has spent over a decade designing and working in blended learning environments, and helped build the largest blended learning program in the state of Nevada.
  • Michael Wacker - Michael is the Online Professional Development Coordinator at Denver Public Schools. Michael designs and facilitates online learning spaces for educators to inquire, share, reflect, and connect.

The panel will be moderated by Bill Fitzgerald; Bill worked in K12 education for 16 years prior to starting FunnyMonkey, an open source development shop that works primarily with education and non-profit organizations.

The panel discussion will start by focusing on the professional needs of people working at different levels within different types of educational systems, and what tools have helped them meet those needs.

The Unconference is free, and takes place on March 19th, in Denver, Colorado. See you there!


Drupal in Education Unconference

2 min read

On Monday, March 19th, we are organizing a Drupal in Education unconference in Denver; the event will be held at Del Pueblo School. This meetup will follow an unconference format, so if there is something you want to talk about, propose a topic, find some like-minded individuals, and let the conversation start.

The event is free, and attendance at the event is capped at 150 people. To attend the unconference, please sign up here. If we get more than 150 attendees, we will start a waiting list. Please sign up only if you are certain you will be attending.

Register for Drupal in Education Unconference in Denver, CO on Eventbrite

I'd like to thank and recognize Michael Wacker and the Denver Public School System for allowing us to hold the conference in their space. Also, Melissa Anderson has provided invaluable organizational work to help get this unconference moving.

Schedule

  • 9:30 to 10: Arrive, brainstorm sessions
  • 10 to 11:30: Session 1
  • 11:30 to 1: Lunch/ongoing conversations. There are several good food options near Del Pueblo. We are also seeing if we can arrange to have a food cart come to the school to provide another option for people to buy lunch.
  • 1 to 2: Session 2: Panel Discussion (see details below)
  • 2 to 3: Session 3

We have set up a wiki page on groups.drupal.org for session ideas; if there is a subject you want to discuss, put it on the wiki.

Additionally, if there is interest, we can reconvene at a restaurant/bar later in the day. Location TBD.

Panel Info

The panel brings together people working at different levels within educational systems. The panel includes practitioners working in K12, Higher Ed, and the US Department of Education. Within the panel discussion, the focus will range from what the needs are (described in a technology-agnostic way) to what technological developments have proven most useful in meeting those needs.

We are still finalizing the participants of the panel; look for a follow-up announcement within the next couple of days!

Getting There

For those people driving, on-site parking is limited.

Once you get to the venue, please enter through the West side Galapago doors. Other doors to the building are generally locked.

An Incomplete History of Sexism In Drupal

2 min read

So here's the thing. I'm proud to be a member of this community. The Drupal community contains some of the smartest, kindest, most generous people I have ever had the privilege of meeting. I don't think I'm overstating anything to say that my involvement in the Drupal community has altered the trajectory of my life. I have learned more in my six years of work within and around this community than at any other period of my life. It's an amazing place.

And that is why I'd like to see us do better. We can always rationalize away the things we don't like, or find ways to justify things that are distasteful.

But if you see something that feels wrong, stop and ask questions. Realize that you will need to have the same conversation, repeatedly. Realize that people will get mad at you and blame you for bringing it up. Realize that in the process of having conversations, you will learn about things that you misunderstand as well. But don't stop having the conversation, because that's how change occurs: one awkward, uncomfortable, unwanted conversation at a time.
