Enrich your SharePoint Content with Intelligence and Automation
Across Office 365 and SharePoint we have a great tradition of taking technologies that required specialized expertise and making them mainstream. This morning at SharePoint Conference 2018 we introduced Microsoft AI in SharePoint with cognitive services, data management and analytics. These aren’t just for developers any more – they’re for everyone, thanks to SharePoint. Millions of users rely on SharePoint lists every day to keep track of critical business data. From T-shirt sizes to issue management to public health programs, SharePoint lists can do it all.
This year, SharePoint lists will get even better – easier to build, easier to edit, and easier to share and analyze.

Add file upload to Microsoft Forms – Add a custom question to allow users to supply a file to upload to SharePoint.

Easier list creation – We’re making it easier to build new SharePoint lists – by copying from other lists, by importing from Excel, or by reusing published templates for common data used in your organization.

Edit in place – You can move data from Excel to SharePoint and keep the same ease of use.

Power BI integration – With so much critical, real-time data in SharePoint lists, you need a way to build great visualizations and insights against that data too.
Anyone who creates flowcharts or SharePoint workflows can now use Visio to design Microsoft Flow workflows. Files are stored in the SharePoint library for group-connected forms.
Automation with content trust
Your automation systems that pull or build images can also work with trust. Any automation environment must set DOCKER_CONTENT_TRUST, either manually or in a scripted fashion, before processing images. Docker attempts to use the contents of the environment variables DOCKER_CONTENT_TRUST_ROOT_PASSPHRASE and DOCKER_CONTENT_TRUST_REPOSITORY_PASSPHRASE as the passphrases for the keys.
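As a minimal sketch, a CI job might enable content trust and supply the passphrases non-interactively before any docker commands run. The passphrase values and the script shape here are illustrative assumptions, not a prescribed setup; real pipelines should inject secrets from a secret store rather than hard-code them.

```shell
#!/bin/sh
# Enable content trust for every docker command in this shell session.
export DOCKER_CONTENT_TRUST=1

# Supply key passphrases non-interactively (placeholder values -- in a real
# pipeline these would come from a secret store, never from the script).
export DOCKER_CONTENT_TRUST_ROOT_PASSPHRASE="example-root-passphrase"
export DOCKER_CONTENT_TRUST_REPOSITORY_PASSPHRASE="example-repo-passphrase"

# Subsequent pulls, pushes, and builds now verify or sign trust metadata
# automatically, e.g.:
#   docker pull docker/trusttest:latest
echo "content trust enabled: $DOCKER_CONTENT_TRUST"
```

Because the variables are exported, every docker invocation later in the same job inherits them without further flags.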
For example, a push with content trust enabled signs the image and its trust metadata:

    $ docker push docker/trusttest:latest
    The push refers to a repository
    a9539b34a6ab: Image already exists
    b3dbab3810fc: Image already exists
    latest: digest: sha256:d149ab53f871 size: 3355
    Signing and pushing trust metadata

When working directly with the Notary client, it uses its own set of environment variables. Before running the docker build command, you should set the environment variable DOCKER_CONTENT_TRUST, either manually or in a scripted fashion. You cannot build an image whose FROM image is not either present locally or signed. With a signed base image, the build pulls and verifies it:

    $ docker build -t docker/trusttest:testing .
    Using default tag: latest
    latest: Pulling from docker/trusttest
    b3dbab3810fc: Pull complete
    a9539b34a6ab: Pull complete
    Digest: sha256:d149ab53f871

With an unsigned base image, the build fails:

    unable to process Dockerfile: No trust data for notrust
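The same gate can be mirrored as a pre-build policy check in a script. The function below is a hypothetical sketch of our own (it is not a Docker feature): it accepts FROM images only from the repository we sign, docker/trusttest from the example above, and rejects anything else before docker build would ever run.

```shell
#!/bin/sh
# Hypothetical pre-build check: treat only base images from the repository
# we sign (docker/trusttest) as trusted; everything else is rejected, which
# parallels the "No trust data" failure above.
check_base_image() {
  case "$1" in
    docker/trusttest:*) echo "trusted: $1" ;;
    *)                  echo "untrusted: $1" ;;
  esac
}

check_base_image "docker/trusttest:latest"
check_base_image "notrust:latest"
```

A CI step would call `check_base_image` on the FROM line and abort the job on an "untrusted" result, so unsigned bases fail fast with a clear message instead of mid-build.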
Security Content Automation Protocol
The Security Content Automation Protocol is a synthesis of interoperable specifications derived from community ideas. Community participation is a great strength for SCAP, because the security automation community ensures the broadest possible range of use cases is reflected in SCAP functionality. This Web site is provided to support continued community involvement. From this site, you will find information about both existing SCAP specifications and emerging specifications relevant to NIST’s security automation agenda. You are invited to participate, whether monitoring community dialog or leading more substantive activities like specification authorship.
NIST’s security automation agenda is broader than the vulnerability management application of modern-day SCAP. Many different security activities and disciplines can benefit from standardized expression and reporting. We envision further expansion into compliance, remediation, and network monitoring, and encourage your contributions in these and additional disciplines. NIST is also working on this expansion plan, so please communicate with the SCAP Team early and often to ensure proper coordination of efforts.
Automating Web Content Discovery
This post is about automating content discovery so you get alerts when new content is pushed to a website. In the post I am referencing, the author was talking about DNS asset identification, and you can see that he has set up web content discovery as well. To test its functionality, I simply created a file called ‘secret’ and pushed it to a webroot. DirBuster is mostly good for directories rather than files: it did not report that it found the ‘secret’ document. (Screenshot: DirBuster failing to find the ‘secret’ file.)
Using the same wordlist that I had used with DirBuster, I ran wfuzz, and it correctly found and reported the file. Previously I had used my own wordlist just to test that it would find the files I knew were there. Now we need a bigger wordlist that contains common files and directories. Run the profiler and log the files it finds to a file in the directory. Now, assume the website changes and run the profiler again, outputting to the same file as before.
To see whether there are any changes between the file now and the file from the past version, use the command. The next post should cover automatically downloading the newly found files; that way, if an update gets retracted, you will still have the content.
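One way to do that comparison is a sketch like the following. Here old.txt and new.txt are stand-ins for the two scan logs (one discovered path per line), the entries in them are invented, and the wfuzz invocation in the comment is only indicative of how such a log might be produced.

```shell
#!/bin/sh
# old.txt / new.txt stand in for two scan logs, e.g. produced by
# something like: wfuzz -w wordlist.txt --hc 404 http://target/FUZZ
printf '/index.html\n/secret\n' > old.txt
printf '/index.html\n/secret\n/backup.zip\n' > new.txt

# comm needs sorted input; -13 prints only lines unique to the second
# file, i.e. paths discovered in the new scan but not in the old one.
sort -o old.txt old.txt
sort -o new.txt new.txt
comm -13 old.txt new.txt
```

With the sample logs above this prints only `/backup.zip`, the newly discovered path, which is exactly the line you would feed into an alert or an automatic download step.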
Avoiding inconsistencies in the Security Content Automation Protocol
Abstract: The Security Content Automation Protocol (SCAP) provides a standardized approach to specifying system configuration, vulnerability, patch and compliance management. SCAP comprises a family of existing standards, such as the Open Vulnerability and Assessment Language (OVAL) and the Common Platform Enumeration (CPE). Defining new or extending existing SCAP content is non-trivial and potentially error-prone. While specifying a vulnerability in OVAL may appear straightforward, the challenge is to specify it in such a way that it is consistent with respect to not just other OVAL data, but also data described under any other standard in SCAP. This paper identifies a number of consistency problems that can occur in SCAP specifications, illustrated using examples from existing OVAL, CPE, CVE and CCE repositories.
It is argued that an ontology-based approach can provide a uniform vocabulary for specifying SCAP data and its relationships. A SCAP ontology, based on Semantic Threat Graphs, is developed, and it is argued that its use can help ensure consistency across large-scale SCAP repositories.
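As a toy illustration of the kind of cross-standard consistency check the abstract motivates, the sketch below flags OVAL-style records whose referenced CPE name is absent from a CPE dictionary. All file names, identifiers, and record contents here are invented for the example; real repositories are XML and would need a proper parser rather than line matching.

```shell
#!/bin/sh
# Invented example data: a tiny CPE dictionary and two OVAL-style records,
# each pairing a definition id with the CPE platform it references.
printf 'cpe:/a:mozilla:firefox\ncpe:/o:microsoft:windows_7\n' > cpe_dict.txt
printf 'oval:example:def:1 cpe:/a:mozilla:firefox\noval:example:def:2 cpe:/a:mozilla:firefoxx\n' > oval_refs.txt

# Flag any definition whose CPE does not appear (as an exact line) in the
# dictionary -- the second record's misspelled CPE is the inconsistency.
while read -r def cpe; do
  grep -qxF "$cpe" cpe_dict.txt || echo "inconsistent: $def -> $cpe"
done < oval_refs.txt
```

Running this reports only the second record, whose misspelled `firefoxx` platform exists in no dictionary entry: precisely the class of dangling cross-reference the paper describes between OVAL and CPE data.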