10 Jun 2018 @ 10:23 AM 

It suddenly occurred to me in the last week that I don’t really have a proper system in place for software downloads here on my website, nor for build integration with source control to rebuild the projects as needed when commits are made. Having set up a Jenkins build environment for the software I work on at my job, I thought it reasonable to make the same demands of myself.

One big reason to do this, IMO, is that it can actually encourage me to create new projects. The effort of packaging up the result and making it easily accessible or usable is often, I find, a demotivator for starting new projects. Having an established “system” in place whereby I can push changes to GitHub and have, say, installer files “appear” on my website as needed can be a motivator- I don’t have to build the program, copy files, run installation scripts, and so on manually every time. I just need to configure it all once, and it all “works” by itself.

To that end, I’ve set up Jenkins appropriately on one of my “backup” computers. It’s rather tame in its capabilities- only 4GB of RAM and an AMD 5350- but it should get the job done, I think. I would use my QX6700-based system, but the AMD system uses far less power. I also considered putting Jenkins straight on my main system, but thought that could get in the way and just be annoying. Besides- this gives that system a job to do.

With the implementation for work, there were so many interdependent projects, and we pretty much always want “everything”, so I just made it a single project which builds everything at once. This way everything is properly up to date. The alternative was fiddling with 50+ different projects and figuring out the appropriate dependencies to decide what to rebuild when other projects were updated- something of a mess. Not to mention it’s all in one repository anyway, which goes against that idea as well.

In the case of my personal projects on GitHub, they are already separate repositories, so I will simply have them built as separate projects. Since Jenkins itself understands upstream/downstream relationships, I can use that as needed.

I’ve successfully configured the new Jenkins setup, and it is now building BASeTris, a Tetris clone game I decided to write a while ago. It depends on BASeScores and Elementizer, so those two projects are in Jenkins as well.

BASeTris’s final artifact is an installer.

But of course, that installer isn’t much good just sitting on my CI server! However, I also don’t want to expose the CI server as a “public” page- there are security considerations, even if I disregard upload bandwidth issues. To that end, I constructed a small program which uploads files to my website over SSH. It runs once a day and is given a directory; it looks in each of that directory’s immediate subdirectories, finds the most recent file, and uploads it to a corresponding remote directory if it hasn’t already been uploaded. I configured BASeTris to copy its final artifact into an appropriate folder there.
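The per-directory selection logic can be sketched roughly like this- a minimal Python sketch of the idea, not the actual utility (the function names and the structure of the “already uploaded” record here are my own invention):

```python
from pathlib import Path

def newest_file(folder):
    """Return the most recently modified file in a folder, or None if empty."""
    files = [f for f in folder.iterdir() if f.is_file()]
    return max(files, key=lambda f: f.stat().st_mtime, default=None)

def pending_uploads(staging_root, already_uploaded):
    """For each immediate subdirectory of staging_root, pick the newest
    artifact; yield it unless a previous run already uploaded it."""
    for sub in sorted(Path(staging_root).iterdir()):
        if not sub.is_dir():
            continue
        candidate = newest_file(sub)
        if candidate is not None and candidate.name not in already_uploaded:
            # The remote directory mirrors the local subdirectory name.
            yield sub.name, candidate
```

The actual transfer (SFTP over SSH) and the persistence of the uploaded-file record are left out, since both depend on the site’s setup.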

Alternatively, it is possible to have each project upload its artifacts via SSH as a post-build step. However, I opted not to do that, because I would rather not have a series of changes throughout the day result in a bunch of new uploads- those would consume space without being particularly useful. Instead, the projects I want to publish are uploaded once a day, and only if there have been changes. This should help reduce the redundancy (and space usage) of those uploads.

My “plan” is to have a proper PHP script or something that can enumerate the folders and provide a better interface for downloads. If nothing else, I would like each CI project’s folder to have a “project_current.php” file which automatically sends the latest build- then I can simply link to that on the blog download page for each project, and only update the page to describe new features or content.

As an example, http://bc-programming.com/downloads/CI/basetris/ is the location that will contain BASeTris version downloads.

There is still much work to do, however. The program(s) do have git hash metadata added to the build, so they have access to their git commit hash, but currently they do not actually present that information. I think it should be displayed, for example, in the title bar, alongside other build information such as the build date, if possible. I’ve tried to come up with a good way to auto-increment the version, but I think I’ll just tweak it as the project(s) change.
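As a sketch of what I have in mind- the function names are hypothetical, and the hash lookup is the standard `git rev-parse` approach, not necessarily how the metadata actually gets embedded at build time:

```python
import subprocess

def git_short_hash():
    """Ask git for the current commit's abbreviated hash ('' if unavailable)."""
    try:
        return subprocess.check_output(
            ["git", "rev-parse", "--short", "HEAD"], text=True).strip()
    except (OSError, subprocess.CalledProcessError):
        return ""

def title_with_build_info(app, version, commit, build_date):
    """Compose a title-bar caption that includes the build metadata."""
    suffix = f" ({commit}, built {build_date})" if commit else ""
    return f"{app} {version}{suffix}"
```

So a window caption might come out as something like “BASeTris 0.1 (abc1234, built 2018-06-10)”, falling back to just the name and version when no hash is available.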

Heck- the SSH uploader utility seems like a good candidate for yet another project to add to GitHub, if I can genericize it so it isn’t hard-coded for my site and purpose.

Posted By: BC_Programming
Last Edit: 10 Jun 2018 @ 10:23 AM

 12 Mar 2015 @ 12:58 AM 

For the last while I’ve been trying to set up an automated build script + CI server for our software. Because of how the repository is laid out, as well as limitations of the software available, it has become quite an adventure.

The ‘build script’ itself is effectively a batch file. The idea is that developers may want to perform a full rebuild manually, outside of a CI server. Originally, we were going to use TeamCity because it has built-in .NET support and can build a solution directly. This did have several problems, though, the main one being that the free version is limited to 20 build configurations. That’s not an unreasonable limitation, but it is problematic since we would need to work around it. We partially avoided it by using a single job per branch, but each branch would still require a build configuration, and we would eventually exceed 20 supported branches. The workaround was to have only one build configuration that pulled down the entire repository, with build steps to build each branch- but hacks are probably the last thing we want. Also, it could probably be argued that trying to work around this sort of license limitation in otherwise commercial software is kind of unethical. There is unlikely to be a good justification to pay the 2K for the required license- and if we can find a better alternative, that will be preferred. That is where Jenkins came in.

Jenkins is a continuous integration server based on Hudson. It was originally designed for Java and has great Java support, but if we want to build .NET projects, we effectively need to do it manually. Of course, at this point I had already constructed an elaborate build script that we can launch directly with a few parameters, so I went ahead and did that.

I was hoping for, but not expecting, immediate success. What ended up happening was a full day of trying to figure it out. My first issue was that one of the programs failed to find a referenced assembly, because it couldn’t find “C:\Program Files %28x86%29\Jenkins\Job…”- which is sort of obvious, given that the path has had its parentheses escaped. I researched the issue and discovered that this was actually a limitation of MSBuild: certain tasks were apparently busted- like a flustered warden, it just couldn’t deal with escapes. The reason I had not encountered this with TeamCity, as it turns out, was simply that I had installed it to C:\TeamCity (the default location). In hindsight I had wondered why it installed to that location, but now it seems it did so because of these .NET limitations (TeamCity being .NET-centric in some regard, it makes sense they would avoid MSBuild-related issues).
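To illustrate the kind of escaping involved- a rough Python approximation on my part, since MSBuild percent-escapes its special characters as %XX hex codes (the exact character set below is my assumption, not pulled from MSBuild’s source):

```python
# Characters MSBuild treats as special (approximate set, for illustration only).
SPECIAL = "$%@'();?*"

def msbuild_escape(text):
    """Percent-escape each special character using the hex of its ASCII code."""
    return "".join(f"%{ord(c):02X}" if c in SPECIAL else c for c in text)

print(msbuild_escape(r"C:\Program Files (x86)\Jenkins"))
# C:\Program Files %28x86%29\Jenkins
```

A task that escapes a path this way but never unescapes it ends up looking for a directory literally named “%28x86%29”, which of course doesn’t exist- hence the missing-assembly error.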

Excellent! I found the problem. So I uninstalled Jenkins, rebooted, reinstalled Jenkins to C:\Jenkins, recreated the job… and was met with disappointment. This time, the build failed with a curious error from NotifyPropertyWeaver.dll. We have this DLL set as part of a post-build step for one of our libraries- coincidentally the same library that was having problems before, as previously it was this DLL that claimed it couldn’t be found.

I was running Visual Studio 2015 CTP6, so I thought, “ah, it must be an issue with the Roslyn compiler, and NotifyPropertyWeaver.dll cannot understand the new compile format! Of course!” Already patting myself on the back, I began the grueling effort of uninstalling Visual Studio 2015 CTP6 and installing Visual Studio 2013. It kind of makes me feel bad, since as an MVP I feel obligated to at least try out new language features and keep on top of such technologies, but the choice had to be made.

After finally getting VS 2013 installed, I attempted to run the build again- and I was greeted with the same error message. Well so much for that theory, it seems. And I was so sure it was the cause!

I was kind of out of ideas at this point, so I started doing some experiments. I found that I couldn’t build it manually from the Jenkins folder, either, yet I was able to build the same build script checked out to my user folder without issue. Then I tried running as administrator and building in my user folder- and found I hit the same error.

Based on this I changed the Security ACL on the C:\Jenkins folder to include full control for all authenticated users- and Jenkins was able to compile all the .NET programs successfully. (Since I hadn’t installed the JDK it hit a wall with the Java applications we have in our codebase which was to be expected).

It was a simultaneously interesting and exciting adventure, though a disappointing one too, since I spent the whole day figuring it out when my intention was to sort out how build artifacts were created.

Posted By: BC_Programming
Last Edit: 12 Mar 2015 @ 12:58 AM

