News ID | Title | Details
---|---|---
22,116 | DrupalEasy Podcast 233 - Ashraf Abed (Debug Academy), Drupal news update |
Ashraf Abed, founder of Debug Academy and Drupal.tv, talks with Ryan about Debug Academy's long-form Drupal training. Also, Mike and Ryan take a trip around recent events in the Drupal community. URLs mentioned
Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher. If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.
22,340 | Import and map 800+ lighthouses in Drupal 9 |
Waaaaay back in 2013, I wrote a blog post about importing and mapping over 5,000 points of interest in 45 minutes using (mainly) the Feeds and Geofield modules. Before that, I had also done Drupal 6 demos of importing and displaying earthquake data. With the recent release of Drupal 9, I figured it was time for a modern take on the idea - this time using the Drupal migration system as well as (still!) Geofield. This time, for the source data, I found a .csv file of 814 lighthouses in the United States that I downloaded from POI Factory (which also appears to be a Drupal site).

Starting point

First, start with a fresh Drupal 9.0.1 site installed using the drupal/recommended-project Composer template. Then, use Composer to require Drush and the following modules:

```
composer require drush/drush drupal/migrate_tools drupal/migrate_source_csv drupal/migrate_plus drupal/geofield drupal/geofield_map
```

Then, enable the modules using:

```
drush en -y migrate_plus migrate_tools migrate_source_csv geofield geofield_map leaflet
```

Overview of approach

To achieve the goal of importing all 814 lighthouses and displaying them on a map, we're going to import the .csv file using the migration system into a new content type that includes a Geofield configured with a formatter that displays a map (powered by Leaflet). The source data (.csv file) contains the following fields (in order): longitude, latitude, name, and description.
So, our tasks will be:
- Create a new Lighthouse content type that includes a Geofield for the location data.
- Configure the Geofield to display a map using the Leaflet map formatter.
- Write and run a migration that imports the .csv data into new Lighthouse nodes.
We will reuse the Drupal title and body fields for the Lighthouse .csv's Name and Description fields. Then, all we need to add is a new Geofield location field for the longitude and latitude.
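The original post walks through this step in the admin UI; as a rough programmatic sketch - assuming the field machine name field_location (which matches the migration later in this article) and a content type machine name of lighthouse - the same field could be created like this:

```php
<?php

// A sketch only: the original article creates this field via the
// admin UI. This creates the Geofield storage on nodes, then
// attaches it to the "lighthouse" content type.
use Drupal\field\Entity\FieldConfig;
use Drupal\field\Entity\FieldStorageConfig;

FieldStorageConfig::create([
  'field_name' => 'field_location',
  'entity_type' => 'node',
  'type' => 'geofield',
])->save();

FieldConfig::create([
  'field_name' => 'field_location',
  'entity_type' => 'node',
  'bundle' => 'lighthouse',
  'label' => 'Location',
])->save();
```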
Next, we'll test out the new Lighthouse content type by manually creating a new node from the data in the .csv file. This will also be helpful as we configure the Geofield map field formatter (using Leaflet).
By default, a Geofield field uses the "Raw output" formatter. With Leaflet installed and enabled, we can utilize the "Leaflet map" formatter (with the default configuration options).
With this minor change, our test Lighthouse node now displays a map!

Prior to writing a migration for any .csv file, it is advisable to review the file to ensure it will be easy to migrate (and roll back). Two things are very important:
- The file's first row should contain column names.
- Each record should have a unique identifier.
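To illustrate with invented values (the actual column layout of Lon, Lat, Name, Description comes from the migration definition below), a before/after sketch of the file's first row:

```
Before (no column names, no unique identifier):
-70.2079,43.6231,"Portland Head Light","Cape Elizabeth, Maine"

After (header row and integer ID column added):
ID,Lon,Lat,Name,Description
1,-70.2079,43.6231,"Portland Head Light","Cape Elizabeth, Maine"
```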
Column names help in mapping .csv fields to Drupal fields, while a unique identifier helps with migration rollbacks. While the unique identifier can be a combination of multiple fields, I find it easiest to add my own when it makes sense. The lighthouse .csv file in this example has neither column names nor a unique identifier field. To rectify this, open the .csv as a spreadsheet and add both. For the unique identifier field, I prefer a simple integer field.

Create the migration

If you've never used the Drupal 8/9 migration system before, it can be intimidating, but at its heart it is basically just a tool that:
- reads records from a source (the "source" section of the migration configuration);
- transforms each record as needed (the "process" section);
- saves the results as Drupal entities (the "destination" section).
Writing your first migration is a big step, so let's get started. The first step is to create a new custom module to house the migration. First, create a new, empty web/modules/custom/ directory. Then, easily create the module's scaffolding with Drush's "generate" command:

```
$ drush generate module

 Welcome to module-standard generator!
–––––––––––––––––––––––––––––––––––––––
 Module name: ➤ Lighthouse importer
 Module machine name [lighthouse_importer]: ➤
 Module description [The description.]: ➤ Module for importing lighthouses from .csv file.
 Package [Custom]: ➤ DrupalEasy
 Dependencies (comma separated): ➤ migrate_plus, migrate_source_csv, geofield
 Would you like to create install file? [Yes]: ➤ No
 Would you like to create libraries.yml file? [Yes]: ➤ No
 Would you like to create permissions.yml file? [Yes]: ➤ No
 Would you like to create event subscriber? [Yes]: ➤ No
 Would you like to create block plugin? [Yes]: ➤ No
 Would you like to create a controller? [Yes]: ➤ No
 Would you like to create settings form? [Yes]: ➤ No

 The following directories and files have been created or updated:
–––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––
 • modules/lighthouse_importer/lighthouse_importer.info.yml
 • modules/lighthouse_importer/lighthouse_importer.module
```

Then, let's create a new web/modules/custom/lighthouse_importer/data/ directory and move the updated .csv file into it - in my case, I named it Lighthouses-USA-updated.csv. Next, we need to create the lighthouse migration's configuration - this is done in a .yml file that will be located at web/modules/custom/lighthouse_importer/config/install/migrate_plus.migration.lighthouses.yml. The resulting module's file structure looks like this:

```
web/modules/custom/lighthouse_importer/
  config/
    install/
      migrate_plus.migration.lighthouses.yml
  data/
    Lighthouses-USA-updated.csv
  lighthouse_importer.info.yml
  lighthouse_importer.module
```

Note that the lighthouse_importer.module file, created by Drush, is empty. While there are a couple of ways to create the migration configuration, we're going to leverage the Migrate Plus module. For more information about writing migrations using code or configurations, check out this blog post from UnderstandDrupal.com. One of the big hurdles of learning to write Drupal migrations is figuring out where to start. It doesn't make much sense to write the migrate_plus.migration.lighthouses.yml from scratch; most experienced migrators start with an existing migration and tailor it to their needs. In this case, we'll start with the core Drupal 7 node migration (web/core/modules/node/migrations/d7_node.yml). Let's break up the configuration of the new lighthouse migration into three parts:
- the "source" section,
- the "process" section, and
- everything after the "process" section.
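As an aside, the generated lighthouse_importer.info.yml isn't shown above; based on the answers given to drush generate, it likely looks something like this sketch (the core_version_requirement line in particular is an assumption):

```yaml
name: Lighthouse importer
type: module
description: Module for importing lighthouses from .csv file.
package: DrupalEasy
core_version_requirement: ^8.8 || ^9
dependencies:
  - migrate_plus:migrate_plus
  - migrate_source_csv:migrate_source_csv
  - geofield:geofield
```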
Our starting point (d7_node.yml) can be viewed in Drupal core; let's update it to look like this:

```yaml
id: lighthouses
label: Lighthouses
source:
  plugin: 'csv'
  path: '/var/www/html/web/modules/custom/lighthouse_importer/data/Lighthouses-USA-updated.csv'
  ids:
    - ID
  fields:
    0:
      name: ID
      label: 'Unique Id'
    1:
      name: Lon
      label: 'Longitude'
    2:
      name: Lat
      label: 'Latitude'
    3:
      name: Name
      label: 'Name'
    4:
      name: Description
      label: 'Description'
```

The main difference is the definition of the "source". In our case, since we're using a .csv file as our source data, we have to fully define it for the migration. The Migrate Source CSV module documentation is very helpful in this situation. Note that the "path" value is absolute. The "ids" section informs the migration system which field(s) serve as the unique identifier for each record. The "fields" section lists all of the fields in the .csv file (in order) so that they are available (via their "name") to the migration.

Everything after the "process" section

This is often the easiest part of the migration configuration to write. Often, we just have to define what type of entity the migration will be creating, as well as any dependencies. In this example, we'll be creating nodes and we don't have any dependencies, so the entire section looks like this:

```yaml
destination:
  plugin: entity:node
```

The "process" section

This is where the magic happens - in this section we map the source data to the destination fields. The format is destination_value: source_value. As we aren't migrating data from another Drupal site, we don't need the nid or vid fields - we'll let Drupal create new node and revision identifiers as we go. As we don't have much source data, we'll have to set default values for several of the fields Drupal is expecting. Others we can just ignore and let Drupal set its own default values. Starting with just the mappings from d7_node.yml, we can modify it to:

```yaml
process:
  langcode:
    plugin: default_value
    source: language
    default_value: "und"
  title: Name
  uid:
    plugin: default_value
    default_value: 1
  status:
    plugin: default_value
    default_value: 1
```

Note that we set the default language to "und" (undefined), the default author to UID 1, and the status to 1 (published). The only actual source data we're mapping to the destination (so far) is the "Name", which we are mapping to the node title. One thing that is definitely missing at this point is the "type" (content type) of node we want the migration to create. We'll add a "type" mapping to the "process" section with a default value of "lighthouse". We have three additional fields from the source data that we want to import into Drupal: longitude, latitude, and the description. Luckily, the Geofield module includes a migration process plugin (geofield_latlon), which allows us to provide it with the longitude and latitude values while it does the dirty work of preparing the data for the Geofield. For the Description, we'll just map it directly to the node's "body/value" field and let Drupal use the default "body/format" value ("Basic HTML").
So, the resulting process section looks like:

```yaml
process:
  langcode:
    plugin: default_value
    source: language
    default_value: "und"
  title: Name
  uid:
    plugin: default_value
    default_value: 1
  status:
    plugin: default_value
    default_value: 1
  type:
    plugin: default_value
    default_value: lighthouse
  field_location:
    plugin: geofield_latlon
    source:
      - Lat
      - Lon
  body/value: Description
```

Once complete, enable the module using:

```
drush en -y lighthouse_importer
```

It is important to note that since we are creating this migration using a Migrate Plus configuration entity, the configuration in migrate_plus.migration.lighthouses.yml is only imported into the site's "active configuration" when the module is enabled. This is often less than ideal, as it means that every time you make a change to the migration's .yml file, you need to uninstall and then re-enable the module for the updated migration to be imported. The Config devel module is often used to automatically import config changes on every page load. Note that this module is normally for local use only - it should never be used in a production environment. As of the authoring of this blog post, the patch to make Config Devel compatible with Drupal 9 is RTBC. In the meantime, you can use the following to update the active config each time you make a change to your lighthouses migration configuration:

```
drush config-delete migrate_plus.migration.lighthouses -y && drush pm-uninstall lighthouse_importer -y && drush en -y lighthouse_importer
```

Testing and running the migration

Use the migrate-status (ms) command (provided by the Migrate Tools module) to check the status of our migration:

```
$ drush ms lighthouses
 ------------------- -------------- -------- ------- ---------- ------------- ---------------
  Group               Migration ID   Status   Total   Imported   Unprocessed   Last Imported
 ------------------- -------------- -------- ------- ---------- ------------- ---------------
  Default (default)   lighthouses    Idle     814     0          814
 ------------------- -------------- -------- ------- ---------- ------------- ---------------
```

If everything looks okay, then let's run the first 5 rows of the migration using the migrate-import (mim) command:

```
$ drush mim lighthouses --limit=5
 [notice] Processed 5 items (5 created, 0 updated, 0 failed, 0 ignored) - done with 'lighthouses'
```

Confirm the migration by viewing your new nodes of type "lighthouse"! If all looks good, run the rest of the migration by leaving out the --limit=5 bit:

```
$ drush mim lighthouses
 [notice] Processed 809 items (809 created, 0 updated, 0 failed, 0 ignored) - done with 'lighthouses'
```

If you don't like the results, you can roll back the migration using "drush migrate-rollback lighthouses" (or "drush mr lighthouses"), make your changes, update the active config, and re-import.

Next steps

There's a lot more to the Drupal migration system, but hopefully this example will help instill some confidence in you for creating your own migrations. The "Leaflet Views" module (included with Leaflet) makes it easy to create a view that shows all imported lighthouses on a single map (see the image at the top of the article). Once you have the data imported, there's so much that you can do!
22,812 | Drupal 8 entity query across (through?) an entity reference field |
If you write custom Drupal 8 (or 9) modules, then you've probably used the entity QueryInterface - which is accessible via the \Drupal::entityQuery() static method. If not, then you're missing out on an incredibly useful tool that allows you to easily query a Drupal site for entities of any kind. For example, if you are looking for all nodes of type Movie that have a Release year of 2009, you can do something like this:

```php
$result = \Drupal::entityQuery('node')
  ->condition('type', 'movie')
  ->condition('field_release_year', '2009')
  ->execute();
```

But what if you want to base the condition on a value of a referenced entity? Maybe you want to find all nodes of type Movie where the director of the movie was born in 1981? Assume nodes of type Movie have an entity reference field to nodes of type Director, where one of the fields on the Director content type is Birth year. It's almost as easy to write an entityQuery condition for this situation as well:

```php
$result = \Drupal::entityQuery('node')
  ->condition('type', 'movie')
  ->condition('field_director.entity:node.field_birth_year', '1981')
  ->execute();
```

Note the entity:node bit in the second condition - this is what allows you to access fields in the referenced entity.
23,602 | DrupalEasy Podcast 234 - Jess Snyder (Drupal Nonprofits), Kaleem Clarkson (Drupal Event Organizers) |
Jess Snyder joins Mike Anello to talk about Drupal and nonprofit organizations - topics include the unique needs of nonprofits, the challenges they have with Drupal 8+, and how nonprofit folks organize and support each other. Also, Kaleem Clarkson returns to the podcast to provide an update on the Drupal Event Organizers Group. URLs mentioned
Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher. If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.
23,867 | Composer 2.0-alpha2 performance comparison |
One of the primary goals of the upcoming Composer 2.0 release is decreasing the memory footprint and increasing the performance of common commands. I decided to test out the performance of the second alpha release of Composer 2.0 to see how much real-world change users can expect to see. tl;dr: Composer 2.0 will be much faster. Users can expect to see up to a 2x gain in speed in composer create-project commands, up to a 10x gain in composer require commands, and over a 2x gain in composer update commands.

I have Composer installed both on Mac OS X as well as automatically via the DDEV web containers that I use for teaching and client work on a day-to-day basis. I ran four tests for each of three composer commands (create-project, require, and update) - two tests on Mac OS X (Composer 1.10.8 and 2.0-alpha2) and two tests in the DDEV web container (Composer 1.10.8 and 2.0-alpha2). The Composer team has made it super easy to test out Composer 2.0 using:

```
composer self-update --preview
```

To return to your original version:

```
composer self-update --rollback
```

Test methodology
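The original post details the exact methodology; as a minimal sketch of how such timings could be captured (assuming three runs per command, matching the averages referenced below - the loop and file names are hypothetical, not the author's actual script):

```bash
#!/usr/bin/env bash
# Hypothetical timing harness: run the create-project test three
# times, logging the output of `time` (written to stderr) to a file.
for run in 1 2 3; do
  rm -rf perf-test
  { time composer create-project drupal/recommended-project perf-test --no-interaction; } 2>> create-project-times.log
done
# The require and update tests could be timed the same way, e.g.:
#   { time composer require drupal/token --no-interaction; } 2>> require-times.log
```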
The graph above displays the average of each test's three runs. The composer require command (using Composer 2.0-alpha2) is slightly faster in the DDEV web container than on Mac OS X - this is a surprising result, and one that I can't explain.

composer create-project drupal/recommended-project
For additional details about the upcoming Composer 2.0 release, see this slideshow by Nils Adermann (co-creator of Composer) and this article on PHP.Watch.
24,399 | DrupalEasy Podcast 235 - Nils Adermann (Composer co-author), DrupalCon Global chat |
Nils Adermann, co-author of the Composer project, joins Mike Anello to talk about the past, present, and future of Composer. Ryan Price and Mike chat about the DrupalCon Global chat system and the future(?) of global Drupal virtual events. URLs mentioned
We're using the machine-driven Amazon Transcribe service to provide an audio transcript of this episode.

Subscribe

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher and YouTube. If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.
24,519 | entityQuery examples for everybody |
The Drupal::entityQuery method has been a staple for Drupal developers since the early days of Drupal 8. But without a dedicated drupal.org documentation page, it can be difficult for new developers to get a really good handle on. I've been using the QueryInterface API documentation for a few years now as my go-to source for help with using entityQuery, but a couple of weeks ago I stumbled on a feature of entityQuery that made me wonder what else, if anything, I was missing. This blog post is meant to provide those new to entityQuery with some commonly used examples, as well as a request for those experienced with entityQuery to let us know what else is possible.

The basics

entityQuery allows developers to query Drupal entities and fields in a SQL-like way. A Drupal site's tables and fields are rarely a 1-to-1 match for the site's entities and fields - entityQuery allows us to query the database as if they were. Much like you could write some SQL that returns all rows of table1 where field1 is 14, using entityQuery you can ask Drupal to return all entities of type "basic page" where the value of field_some_field is 14. Developers can obtain the QueryInterface using the following method:

```php
$query = \Drupal::entityQuery(string $entity_type_id);
```

Example 1: Simple node entity queries

This first example returns all nodes of type page where the value of field_some_field (an integer field) is 14.

```php
$query = \Drupal::entityQuery('node')
  ->condition('type', 'page')
  ->condition('field_some_field', 14);
$results = $query->execute();
```

This example shows that an entityQuery is built up by adding various conditions, sorts, ranges, and other qualifiers. Note that the $query methods are all chainable - each one returns the $query, so we can add multiple conditions per line. For conditions, the default operator is "=", so in both of the conditions shown above, the entityQuery is looking for type=page and field_some_field=14. If we wanted to find all basic page nodes where field_some_field > 14, then the entityQuery would look like this:

```php
$query = \Drupal::entityQuery('node')
  ->condition('type', 'page')
  ->condition('field_some_field', 14, '>');
$results = $query->execute();
```

Sorts and ranges can also be easily added:

```php
$query = \Drupal::entityQuery('node')
  ->condition('type', 'page')
  ->condition('field_some_field', 14, '>')
  ->sort('nid', 'ASC')
  ->range(0, 10);
$results = $query->execute();
```

Finally, we can save ourselves some typing by calling execute() directly on the $query:

```php
$results = \Drupal::entityQuery('node')
  ->condition('type', 'page')
  ->condition('field_some_field', 14, '>')
  ->sort('nid', 'ASC')
  ->range(0, 10)
  ->execute();
```

The execute() method returns an array of IDs for the entities found. For node entities, this is the node ID. If this were an entityQuery of users, then the ID would be the user ID.

Example 2: Condition groups

While the first example covers many real-world use cases, another pattern that is often seen is that of the condition group. Consider the use case where we want to find all users whose account was created either before the year 2010 or since January 1, 2020.
In this case, we can create an orConditionGroup as part of the entityQuery, as well as adding additional conditions, sorts, and ranges:

```php
$query = \Drupal::entityQuery('user');
$group = $query->orConditionGroup()
  ->condition('created', '1262304000', '<')   // Jan 1, 2010.
  ->condition('created', '1577836800', '>');  // Jan 1, 2020.
$results = $query->condition($group)
  ->condition('status', 1)
  ->sort('created', 'DESC')
  ->execute();
```

Example 3: Reaching into reference fields

My latest entityQuery discovery is the fact that it can be used to query field values of referenced entities. This means that if you have two node types:
- Event (with an entity reference field, field_location, that references Location nodes)
- Location (with a field_venue_type field)
Then you can use entityQuery to return all event nodes whose location's "venue type" is a particular value:

```php
$results = \Drupal::entityQuery('node')
  ->condition('type', 'event')
  ->condition('field_location.entity:node.field_venue_type', 'boat')
  ->execute();
```

Note: I recently wrote about this in a quicktip as well.

Example 4: Show me the SQL

During DrupalCamp Asheville, I presented a short, mostly-unplanned session on this topic during the unconference. Thanks to Hussain Abbas, Kristen Pol, and others, I learned how easy it is to see the SQL that entityQuery actually uses to return the list of entity IDs. For example, to output the SQL from the previous example, use:

```php
$query = \Drupal::entityQuery('node')
  ->condition('type', 'event')
  ->condition('field_location.entity:node.field_venue_type', 'boat')
  ->__toString();
```

Resulting in:

```sql
SELECT base_table.vid AS vid, base_table.nid AS nid
FROM node base_table
INNER JOIN node_field_data node_field_data ON node_field_data.nid = base_table.nid
INNER JOIN node__field_location node__field_location ON node__field_location.entity_id = base_table.nid
LEFT OUTER JOIN node node ON node.nid = node__field_location.field_location_target_id
INNER JOIN node__field_venue_type node__field_venue_type ON node__field_venue_type.entity_id = node.nid
WHERE (node_field_data.type = 'event') AND (node__field_venue_type.field_venue_type_value = 'boat')
```

Example 5: How deep can we go?

Let's go back to example 3 - is it possible to create a condition that queries a field value of an entity that is referenced by an entity that is referenced by the entity you are querying on? Consider the use case where we want to find all events whose location has a term whose name is "sailboat". It turns out that it is:

```php
$results = \Drupal::entityQuery('node')
  ->condition('type', 'event')
  ->condition('field_location.entity:node.field_tags.entity:taxonomy_term.name', 'sailboat')
  ->execute();
```

Gotchas

It is important to understand that entityQuery will only return entities that the current user has access to - access control is performed by default. If you want to disable access control, then add the following to the query:

```php
$query->accessCheck(FALSE);
```

Did I miss anything? What else can be done with entityQuery? Feel free to leave a comment and let me know.
25,339 | Adding non-PHP dependencies to a Composer-based project |
Over the past few years, the Drupal community has been (sometimes slowly) embracing the Composer dependency manager for PHP projects. We have become accustomed to adding Drupal modules and base themes to our projects using composer require, but many of us have only scratched the surface of what more Composer can do for us. In this article, we'll go step-by-step through adding a non-PHP dependency to our project using Composer - as well as the helpful Composer installers extender plugin. We'll utilize Asset Packagist, a Composer repository for many popular NPM and Bower assets, including things like Javascript libraries and CSS frameworks. The general idea is that for just about any Drupal module or base theme that asks you to manually download an external library, you can use this method instead of the manual steps.

Example 1: Simple, but not super-useful

Let's start with a simple example - adding the library required by the Photoswipe module. The instructions for this module include some manual steps to download and place the plugin in a specific directory, as well as an alternative installation using Composer. In this example, we'll accomplish the download part using Asset Packagist. First, add the Photoswipe Drupal module using:

```
composer require drupal/photoswipe
```

Next, add Asset Packagist to the repositories section of the project's composer.json file:

```json
{
    "type": "composer",
    "url": "https://asset-packagist.org"
}
```

Then, find the Photoswipe Javascript library on Asset Packagist. Finally, require the Photoswipe Javascript library using:

```
composer require npm-asset/photoswipe
```

Super easy, right?! Yes, but notice that the Photoswipe library wasn't installed in the proper /web/libraries/ location. Instead, Composer installed it in its default location for dependencies, the /vendor/ directory. Because the Photoswipe library doesn't have a composer.json file, and therefore doesn't have a Composer "type", there's nothing that the Composer installers plugin can do to help.

Example 2: Placement

The Composer installers extender plugin can help with this task. It allows us to extend the types of dependencies that Composer installers can handle (by default, Composer installers only handles a specific set of dependency types). First, add the Composer installers extender plugin to your project:

```
composer require oomphinc/composer-installers-extender
```

Let's back up example 1 a bit so we can not only download the Photoswipe library to our project, but also put it in the proper place:

```
composer remove npm-asset/photoswipe
```

Next, we need to let Composer installers extender know that we want its help in handling Composer dependencies from the npm-asset vendor. We do this by adding a bit of configuration to the "extra" section of our project's composer.json:

```json
"installer-types": ["npm-asset"],
```

This allows Composer installers extender to help Composer installers place dependencies from the npm-asset vendor in a custom directory. Let's set the custom directory by adding to the default (if you're using the drupal/recommended-project Composer template) installer-paths section of the composer.json file:

```json
"web/libraries/{$name}": [
    "type:drupal-library",
    "npm-asset/photoswipe"
],
```

Finally, use Composer to re-require the Photoswipe library:

```
composer require npm-asset/photoswipe
```

Verify that the Photoswipe Javascript library has been installed in the web/libraries/ directory.
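Pulling the fragments above together, the relevant portions of the project's composer.json might look like this (a sketch - a real drupal/recommended-project file contains additional repositories, installer paths, and other sections):

```json
{
    "repositories": [
        {
            "type": "composer",
            "url": "https://asset-packagist.org"
        }
    ],
    "extra": {
        "installer-types": ["npm-asset"],
        "installer-paths": {
            "web/libraries/{$name}": [
                "type:drupal-library",
                "npm-asset/photoswipe"
            ]
        }
    }
}
```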
Summary

The combination of Asset Packagist and the Composer installers extender really opens the door to allow you to manage the vast majority of your project dependencies in a consistent manner. Use it to install Bootstrap, Foundation, the Chosen module's Javascript library, A11y, Babel, and other popular NPM and Bower packages.
25,815 | DrupalEasy Podcast 236 - AmyJune Hineline (virtual Drupal events) |
AmyJune Hineline, community ambassador at Kanopi Studios, joins Mike Anello to talk about virtual Drupal events. URLs mentioned
We're using the machine-driven Amazon Transcribe service to provide an audio transcript of this episode.

Subscribe

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher and YouTube. If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.
39,903 | 5 Reasons to get serious about your Drupal path in 2021 |
As we look forward to the vaccine-influenced future beyond COVID-19, one of the very few things we have lived through over the past 9 months that may actually have a positive lasting impact on society is the realization that there are often advantages to accomplishing things virtually. Working remotely, meeting via Zoom, online appointments, grocery delivery, even doctor visits have screamed past proof-of-concept in situations most of the world had previously not considered. So many more applications that rely on the web are now ingrained in daily life since COVID forced them on us. With this incremental leap in mass adoption of web technologies, web development (and, for the purpose of this article, Drupal) will continue to play a major role as the world leverages online tools more, and for more applications. This suggests that Drupal positions (which totaled more than 1,600 listings on Indeed in the first week of December this year) may open up even more in the future. So, for those who conceive, create, and maintain current and future tools - and for people considering these careers - the future looks pretty bright. Those of us in the community know that, even beyond the new growth potential, Drupal and web development have a lot to offer; so we humbly share these 5 great reasons why 2021 may be a good time to get serious about your Drupal dexterity, whether you want to expand your skills, get up to speed, or even pivot your career to seize the day and the opportunity:
It’s hard to find an article about promising careers without web developer positions seated in the upper tiers. According to a recent article by CNBC, web developer careers are among the Top 15 high-demand jobs over the next 5 years. Indeed puts it at #10 for careers most in demand right now, and the US Department of Labor Occupational Outlook Handbook estimates that 2019’s total of 174,300 web developer positions will grow by about 14,000 over the next 10 years, a higher rate than most vocations.
If you have the desire and commitment to become a Drupal professional, you can! Drupal training is available through go-at-your-own-pace options like Drupalize.me, through focused live and online training sessions (Evolving Web, Mediacurrent), and, of course, might we suggest the longest-running long-form career technical education program, DrupalEasy Academy’s Drupal Career Online. There is also plenty of advice out there, like DrupalEasy Career Resources and How to start a Web development career.
Life in 2020 changed work for a lot of people, but remote working is nothing new in the world of web developers. Drupalists, both those employed by organizations and those who choose to freelance, have a long and successful history of working from home, or wherever they find great wifi and interesting surroundings. There are state-of-the-art tools that support spread-out teams, specifically catering to this efficient, low-overhead way of doing business. Add “virtual” to your job search on Indeed or CareerBuilder, and you can see hundreds of positions.
According to the U.S. Department of Labor Bureau of Labor Statistics, the median annual wage for web developers in May 2019 was $73,760. It gets better if you choose Drupal: Indeed reports that, based on 188 salaries reported as of November 2020, the average salary for a Drupal developer in the United States is $95,642 per year. You will have to get some experience and hone your skills, but with commitment and patience, the high-wage jobs are yours to strive for.
Come for the code, stay for the community. It’s not just a mantra. The Drupal community has groups connected by topic, interests, service to the community, and even outside interests. There is nothing like it. Even here at DrupalEasy, we have a micro-community of past and current students that meets up every week to help each other work through projects and issues and to provide support. It’s our DrupalEasy Learning Community, and we like to think it’s a microcosm of the greater, global Drupal community that makes our content management system, and the people who support it, so outstanding. The next session of Drupal Career Online starts March 1, 2021, with the application deadline the last week in February. If you are interested in learning Drupal, or in honing your Drupal skills through our long-form Drupal Career Online, let us know, or sign up for one of our no-cost Taste of Drupal mini-webinars coming up in the beginning of 2021.