Put Your Blog on Cruise Control with Automated Content
There’s no getting around it: generating material for your blog or website is difficult, time-consuming work, which is why so many people and companies are developing solutions for automating content. Before we talk about those solutions, it’s worth considering what automated content can and can’t do for you and your business. Content creation’s role in SEO is pretty obvious: the more content you generate containing the keywords your customers are searching for, the better your chances of being found by them. And even if your site visitors aren’t reading much of your content, knowing it’s there will make them more likely to trust you and continue studying what you have to offer. What automated content can’t do is replace genuinely original writing: it is usually too general and too derivative for that.
There are a number of applications out there you can use to produce blog content. One step up from simple curation tools are services like BlogSense and Linguastat, which modify or transform existing content into a somewhat more original, and therefore more valuable, product. These applications also integrate deeply with content management platforms like WordPress, so they can publish content to your blog automatically without requiring you to reformat it. If you’re looking for automated video content, you might consider an application like Qwiki, which generates interactive multimedia presentations based on existing written and visual content on a given subject. Qwiki is in the process of shifting its focus to mobile applications, making it even more useful for real estate professionals looking to create great video content in the field and on the fly.
If you’d rather outsource the work, you can also work directly with a content marketing firm, which will help you determine your content parameters and then send them off to its writers, graphic designers, and other specialists. However you approach it, automated content can certainly give you a boost in search engine results and free up valuable time and money in your marketing budget.
Continuous integration and deployment automation
Our process at Branded3 currently works like this: we have four environments; continuous delivery, internal UAT, external UAT, and production. Code, including the Sitecore root and our own code, is checked into GitHub. We then use TeamCity to build the solution, run any gulp tasks for Sass/JS minification, and finally run OctoPack to package up all the files into what is basically a NuGet package. Octopus Deploy then automatically pushes the code out to the continuous delivery environment.
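For readers unfamiliar with the OctoPack step, the package it produces is described by an ordinary NuGet .nuspec. The sketch below is a hypothetical minimal example; the id, version, and description are illustrative, not taken from the Branded3 project (by default OctoPack can also generate this metadata from the project itself).

```xml
<?xml version="1.0"?>
<!-- Hypothetical minimal .nuspec for an OctoPack-built web package.
     All values here are illustrative placeholders. -->
<package xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
  <metadata>
    <id>MySitecoreSite</id>
    <version>1.0.0</version>
    <authors>BuildServer</authors>
    <description>Web deployment package built by TeamCity and OctoPack</description>
  </metadata>
</package>
```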
When we get to a milestone build that we are considering pushing to production, we use Octopus Deploy to push the package to internal UAT for in-house testing, then to external UAT for client testing. We are using Unicorn for item serialisation, and the serialised files get published into the package; when we deploy, we manually run the sync command. When the build gets pushed to each environment, we run a post-deploy PowerShell script which copies over any environment-specific configs from a folder included in the package and repoints the IIS directory to the new location.

We also have one project which uses the Sitecore Azure module. For that we use Octopus Deploy the same way, but the production environment pushes only to the admin server; once the release is complete, we log in and use the controls in Sitecore to deploy to the CD environment. Soon I will be replacing this with our own PowerShell deployment so that it can all be done in one step by Octopus Deploy.

The biggest gotcha I have found is the way Sitecore does upgrades: it is totally at odds with the modern deployment mechanism. If you want to upgrade to a new version of Sitecore, you have to run the upgrade process on each environment individually and then push the updated build, otherwise you don’t get the item changes.
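The post-deploy config-copy step described above can be sketched roughly as follows. The team’s actual script is PowerShell; this is a self-contained shell translation, and the `EnvConfigs` folder name, file names, and directory layout are all assumptions for illustration, not the real package structure.

```shell
#!/bin/sh
# Sketch of a post-deploy config copy (hypothetical layout): the package
# ships per-environment configs in EnvConfigs/<env>, and after a deploy
# the configs for the target environment are copied over the site root.
set -e

ENVIRONMENT="UAT"
PACKAGE_DIR="./demo-package"
SITE_DIR="./demo-site"

# Simulate a deployed package containing environment-specific configs.
mkdir -p "$PACKAGE_DIR/EnvConfigs/$ENVIRONMENT" "$SITE_DIR"
echo "uat-connection-string" > "$PACKAGE_DIR/EnvConfigs/$ENVIRONMENT/ConnectionStrings.config"

# The post-deploy step: overwrite generic configs with the env-specific ones.
cp -R "$PACKAGE_DIR/EnvConfigs/$ENVIRONMENT/." "$SITE_DIR/"

cat "$SITE_DIR/ConnectionStrings.config"
```

In the real pipeline the same script would also repoint the IIS site directory at the new deployment folder, which is what makes the switch to a new build effectively atomic.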
The only other solution is to serialise the entire Core and Master databases, but that isn’t a good idea.
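A common middle ground is to scope Unicorn’s serialisation to only the items your team owns, via an include predicate, rather than serialising whole databases. The sketch below shows the general shape of such a configuration; the configuration name, paths, and exact attribute names are illustrative from memory, so verify them against the Unicorn documentation for your version before use.

```xml
<!-- Sketch of a scoped Unicorn configuration (illustrative names and paths).
     Serialising only your own templates and renderings avoids shipping the
     entire Master/Core databases with every build. -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <unicorn>
      <configurations>
        <configuration name="Project.Templates">
          <predicate type="Unicorn.Predicates.SerializationPresetPredicate, Unicorn">
            <include database="master" path="/sitecore/templates/MyProject" />
            <include database="master" path="/sitecore/layout/Renderings/MyProject" />
          </predicate>
        </configuration>
      </configurations>
    </unicorn>
  </sitecore>
</configuration>
```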
Radiologik Radio Automation Software for Mac
Radio automation software actually designed for radio

Radiologik is a system suitable for both live DJing and 24/7 radio automation on the Mac that uses iTunes as its database and iTunes playlists as the logical building blocks for sophisticated programming. Radiologik was developed for and is used in LPFM, NCE-FM, college, high school, and online stations. Fully automated and unmanned stations use Radiologik to pick content by date, intro and outro artists and titles, announce the time, play station IDs, play podcasts, and manage and play advertising on a separate schedule that integrates with the programming schedule, all completely autonomously. Radiologik DJ can be used by itself as a DJ program for live events. It is also the player Radiologik Scheduler uses to make a full-time automated radio station.
Professional transitions

Use Track Prep in the free Radiologik Scheduler Basic to analyze your tracks ahead of time for the best radio transitions.

Radiologik Scheduler

You can use the Scheduler to run a 24/7 automated terrestrial or internet station. Radiologik Scheduler Basic is free with DJ. It supports picks and fills from iTunes playlists with time instructions, artist separation, unique-track checking, and best-fit exact-time searching for top-of-the-hour placement of station IDs or other arbitrary times. Radiologik Scheduler Advanced is a mode for the Scheduler which further supports voiceover intros and outros for specific tracks, artists, and albums.
Learn by video

To learn more about Radiologik DJ and Scheduler, visit the Radiologik video site for three and a half hours’ worth of overviews and comprehensive tutorials.

Meet other Radiologik users

Visit the independently run Radiologik Users Forum to exchange tips, get peer help, and discuss radio and Mac issues with other Radiologik users.

Listen to an example

Radiologik Trance is a simple example station that uses Radiologik.