Zaphne – Content Automation News for 06-03-2018

Part 3, The Deployment Wizard

vRA 6.x’s implementation involves a series of appliance deployments, VAMI configurations, prerequisite headaches, and the installation of several IaaS components on Windows hosts. While the end-to-end implementation of vRA has come a long way, it still left a lot to be desired. Continuing with the theme of redefining the user experience, vRA 7’s new deployment wizard takes time-to-value to a whole new level. Starting with a single 5GB OVA download, admins log in to the Virtual Appliance Management Interface and are immediately presented with the new Deployment Wizard UI. The wizard offers a choice of a minimal or enterprise deployment and then, based on the chosen deployment type, walks the admin through the configuration details needed for the various working parts of vRA, including all the Windows-based IaaS components and their dependencies. 

For HA deployments, all the core components are automatically configured and made highly available based on these inputs. See Part 2 of this series for more details on the deployment architecture. In both minimal and enterprise deployments, the IaaS components are automatically pushed to the Windows servers made available to the installer by the management agent, with host roles assigned according to the placement the admin selects in the wizard. The IaaS Prerequisite Checker has also been fully incorporated into the wizard, allowing the installer to check prerequisites on the IaaS hosts and determine whether they have been properly prepared for their selected server roles. 
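
Conceptually, the prerequisite check is role-driven: for each Windows host, the installer verifies the requirements of whatever IaaS roles were placed on it. Here is a minimal Python sketch of that idea; the role names, requirement names, and hosts are hypothetical illustrations, not the actual vRA 7 checker.

```python
# Hypothetical illustration of a role-driven prerequisite check,
# not the actual vRA 7 Prerequisite Checker logic.

# Requirements per IaaS server role (role and requirement names are made up
# for illustration; the real checker validates items such as .NET, IIS
# features, and other host prerequisites on the appropriate servers).
ROLE_REQUIREMENTS = {
    "web": ["dotnet_runtime", "iis_installed", "iis_windows_auth"],
    "manager": ["dotnet_runtime", "msdtc_enabled"],
    "dem": ["dotnet_runtime"],
    "agent": ["dotnet_runtime", "powershell_execution_policy"],
}

def check_requirement(host, requirement):
    """Placeholder for a real probe (e.g. a remote query against the host)."""
    print(f"checking {requirement} on {host} ...")
    return True  # assume the host is prepared in this sketch

def check_host(host, roles):
    """Return the list of unmet requirements for the roles placed on a host."""
    missing = []
    for role in roles:
        for req in ROLE_REQUIREMENTS[role]:
            if not check_requirement(host, req):
                missing.append(f"{role}:{req}")
    return missing

if __name__ == "__main__":
    placement = {"iaas-01.lab.local": ["web", "manager"],
                 "iaas-02.lab.local": ["dem", "agent"]}
    for host, roles in placement.items():
        problems = check_host(host, roles)
        print(f"{host}: {'ready' if not problems else 'missing ' + str(problems)}")
```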

In a POC deployment, the time from VAMI login to a fully implemented system clocks in at roughly 25 minutes in an environment with adequate resources. For an HA deployment, end-to-end configuration takes under a couple of hours. The goal here was not only to improve the end-to-end deployment and the associated user experience, but also to give customers the ability to quickly deploy and test vRA 7 with minimal time investment. 

Keywords: ["deployment", "vRA", "wizard"]
Source: http://www.virtualjad.com/2015/10/vrealize-automation-7-part-3-the…

SCAP

Beginning with the January 2018 Quarterly Release, DISA will publish updated benchmarks using the Security Content Automation Protocol, version 1.2. Migration to the SCAP 1.2 standard started with the recent release of the Windows Server 2016 Benchmark and will continue with the forthcoming release of the Red Hat Enterprise Linux 7 Benchmark. SCAP 1.2 introduces new capabilities for automated assessments through its updated component languages, providing more flexibility in developing new content. Some of these capabilities, described below, may be used in future DISA Benchmark updates or new releases. The Open Vulnerability and Assessment Language, version 5.10, adds support for Windows PowerShell cmdlets, shared resource effective rights tests, and shared resource audited permissions tests. 

OVAL 5.10 also improves support for Linux RPM verification and adds last-logon checks for Windows and UNIX/Linux systems. The Common Platform Enumeration, version 2.3, includes an applicability language that gives the benchmark the ability to determine whether a particular STIG rule applies to the system being evaluated. This facility has allowed the Windows Server 2016 Benchmark to be published as a single benchmark, with domain-controller and member-server checks evaluated only where applicable. DISA continues validation testing of SCAP 1.2 content with recent versions of HBSS/ePO/Policy Auditor, SPAWAR SCC, and ACAS. 
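
The practical effect of the applicability language is that one benchmark can carry platform-conditional rules and evaluate only the ones that match the target. The following is a rough Python sketch of that concept; the rule identifiers and the platform probe are simplified stand-ins and do not reflect the real XCCDF/CPE schema.

```python
# Simplified illustration of platform-applicability filtering, in the spirit
# of the CPE 2.3 applicability language; this is not the real XCCDF/CPE schema.

# Each rule declares which platform(s) it applies to (illustrative identifiers).
RULES = [
    {"id": "rule-password-length", "platforms": {"member-server", "domain-controller"}},
    {"id": "rule-krbtgt-password-age", "platforms": {"domain-controller"}},
    {"id": "rule-local-admin-rename", "platforms": {"member-server"}},
]

def detect_platform():
    """Stand-in for a real platform probe (e.g. querying the host's domain role)."""
    return "member-server"

def applicable_rules(platform):
    """Yield only the rules whose applicability statement matches the target."""
    for rule in RULES:
        if platform in rule["platforms"]:
            yield rule["id"]

if __name__ == "__main__":
    platform = detect_platform()
    print(f"Evaluating as: {platform}")
    for rule_id in applicable_rules(platform):
        print(f"  applies: {rule_id}")
```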

Though the content will be published as a ZIP file, ePO requires that the contents of the ZIP be extracted and then imported, rather than importing the ZIP file itself. As SCAP 1.2 releases of benchmarks are posted, the previous SCAP 1.1 releases will be removed from IASE. To prepare for SCAP 1.2 content, please ensure your organization is using the current STIG tools and automation content available from IASE. 
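
For the ePO workflow, that simply means unpacking the benchmark ZIP first and then importing the extracted files. A small Python sketch using only the standard library (the file and directory names below are placeholders, not actual DISA release names):

```python
# Extract a downloaded SCAP benchmark ZIP so the individual files can be
# imported into ePO (ePO does not accept the ZIP itself).
import zipfile
from pathlib import Path

benchmark_zip = Path("U_Benchmark_SCAP_1-2.zip")   # placeholder file name
extract_dir = Path("benchmark_extracted")          # placeholder directory

extract_dir.mkdir(exist_ok=True)
with zipfile.ZipFile(benchmark_zip) as archive:
    archive.extractall(extract_dir)

# List what was extracted; these are the files to import into ePO.
for item in sorted(extract_dir.rglob("*")):
    if item.is_file():
        print(item)
```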

Keywords: ["Benchmark", "Content", "Release"]
Source: http://iase.disa.mil/stigs/scap

Accessibility testing

Web accessibility testing is a subset of usability testing in which the users under consideration have disabilities that affect how they use the web. In order to be fair to all, governments and other organizations try to adhere to various web accessibility standards, such as the US federal government’s Section 508 legislation and the W3C’s Web Content Accessibility Guidelines. As the person evaluating accessibility, you are the subject-matter expert, and it is your role to raise additional accessibility concerns. If you’re testing against German BITV 1.0 Level 2, the Italian Stanca Act, or the WCAG 2.0 draft, the only current option is the experimental ATRC Web Accessibility Checker. There is no such thing as fully automated accessibility testing. 

On the desktop side of things, OS X comes with Accessibility Inspector and Accessibility Verifier, and Accerciser is available for the GNOME AT-SPI (assistive technology) API. Tools for poking at the HTML document object model include DOM inspectors, as seen in Opera Dragonfly and Firebug, and accessibility tool bundles such as the Web Accessibility Toolbar for Internet Explorer and Opera and the ICITA Firefox Accessibility Toolbar. Inspecting what is exposed to the desktop-level accessibility structures is important for checking what plugin content is being exposed to assistive technology that uses those accessibility models. Keyboard accessibility can also be tested physically, for example by using a mouthstick to press keys. 

Testing for equivalents synchronized with multimedia, such as captions and audio descriptions, can be done by digging into your media player’s preferences to turn on its accessibility settings. Accessibility inspection tools like the Firefox Accessibility Extension can make such tasks easier by, for example, listing the headings on the page or listing the attributes of form fields. If you are testing a video sharing site for accessibility with real users, do not begin by asking them whether they can use particular controls. 
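
Checks like listing headings or spotting unlabeled form fields can also be scripted. Below is a minimal Python sketch using the standard library’s html.parser that lists headings and flags inputs with no obvious accessible name; it is a rough heuristic for illustration, not a replacement for the inspection tools mentioned above.

```python
# Minimal, illustrative accessibility scan: list headings and flag form
# inputs that carry neither an aria-label, a title, nor an id attribute.
# This is a rough heuristic sketch, not a full accessible-name computation.
from html.parser import HTMLParser

class AccessibilityScan(HTMLParser):
    def __init__(self):
        super().__init__()
        self._in_heading = None
        self.headings = []          # [level, text] pairs
        self.unnamed_inputs = []    # attribute dicts for suspect inputs

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._in_heading = tag
            self.headings.append([tag, ""])
        elif tag == "input" and attrs.get("type") not in ("hidden", "submit"):
            if not (attrs.get("aria-label") or attrs.get("title") or attrs.get("id")):
                # Without an id, no <label for="..."> can reference it either.
                self.unnamed_inputs.append(attrs)

    def handle_endtag(self, tag):
        if tag == self._in_heading:
            self._in_heading = None

    def handle_data(self, data):
        if self._in_heading:
            self.headings[-1][1] += data.strip()

if __name__ == "__main__":
    sample = """
    <h1>Video library</h1>
    <h2>Search</h2>
    <form><input type="text" name="q"></form>
    """
    scan = AccessibilityScan()
    scan.feed(sample)
    for level, text in scan.headings:
        print(f"{level}: {text}")
    for attrs in scan.unnamed_inputs:
        print("input without accessible name:", attrs)
```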

Keywords: ["accessibility", "Test", "users"]
Source: https://www.w3.org/wiki/Accessibility_testing
