<?xml version="1.0" encoding="utf-8" standalone="yes" ?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Things and Stuff</title>
    <link>https://6fx.eu/</link>
    <description>Recent content on Things and Stuff</description>
    <generator>Hugo -- gohugo.io</generator>
    <language>en-us</language>
    <lastBuildDate>Thu, 17 May 2018 12:25:00 +0100</lastBuildDate>
    <atom:link href="https://6fx.eu/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>CI and Autostaging</title>
      <link>https://6fx.eu/ci-and-autostaging/</link>
      <pubDate>Thu, 17 May 2018 12:25:00 +0100</pubDate>
      <guid>https://6fx.eu/ci-and-autostaging/</guid>
      <description>In my last post about leveraging GitLab CI and Autopkg to build packages, I mentioned that all packages falling out of the CI pipeline are first put into the testing catalog. Until now I have used munki-staging.py and a Trello board to move packages from the testing catalog into production in a mostly manual process. Since then I have been thinking about how to integrate staging into that CI process as well.</description>
    </item>
    <item>
      <title>CI That Autopkg</title>
      <link>https://6fx.eu/ci-that-autopkg/</link>
      <pubDate>Wed, 02 May 2018 10:03:27 +0100</pubDate>
      <guid>https://6fx.eu/ci-that-autopkg/</guid>
      <description>In my infrastructure here, I am using munki to take care of all my software distribution needs. The majority of software packages are built with the excellent Autopkg on a dedicated Mac mini. Until now, this was a manually triggered process: I maintained a list of recipes, and from time to time (usually daily) I would run something like autopkg run --recipe-list=my_recipes.list. Afterwards, I would manually rsync the local munki repository onto the distribution infrastructure.</description>
    </item>
  </channel>
</rss>
