SSIS isn’t always the easiest tool to debug or troubleshoot. It can be especially challenging to use script tasks and components due to their limited support for C#. By limited support, I mean:
- Not being able to leverage NuGet
- Script projects being generated and torn down on the fly
- Being limited to the version of Visual Studio (VS) that aligns with the SQL Server version running the SSIS package
With this in mind, it can be very helpful to relegate all the additional work to a DLL and leverage that, rather than a blend of SSIS and script tasks/components.
The standard recommendation is to load DLLs into the GAC. I personally don’t like this because the GAC is shared at the machine level. This means your deployment could break my packages by upgrading or downgrading one of my DLLs. I found a way online to load assemblies that aren’t in the GAC.
I have a few contentions with this solution:
- Every DLL must be explicitly added
- There is no logging when:
  - a file is not found
  - a DLL isn’t included
I’d like to propose my own solution below. It provides some notable benefits:
- You don’t need to explicitly state every DLL
- You can set up a common location for the DLLs
- You can pass the DLLs in as a read-only parameter
- Any loaded or missing DLL will get logged
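In C#, the hook for this pattern is the AppDomain.CurrentDomain.AssemblyResolve event, which fires when a normal load fails; the handler can probe the shared folder from the read-only parameter and log the outcome. Since the full C# task isn’t reproduced here, this is a language-neutral sketch of the same fallback-resolver idea in Python (folder paths and names are placeholders):

```python
import importlib.util
import logging
import os
import sys

class FolderResolver:
    """Fallback module resolver: a rough analogue of handling the .NET
    AppDomain.AssemblyResolve event in an SSIS script task. When the
    normal lookup fails, search a configured folder and log the result."""

    def __init__(self, folder):
        # In SSIS, this folder would arrive as a read-only package parameter.
        self.folder = folder

    def find_spec(self, name, path=None, target=None):
        candidate = os.path.join(self.folder, name + ".py")
        if os.path.exists(candidate):
            logging.info("Resolved %s from %s", name, candidate)
            return importlib.util.spec_from_file_location(name, candidate)
        logging.warning("Could not resolve %s (looked in %s)", name, self.folder)
        return None

def register_resolver(folder):
    # Appended, so it only runs after the standard lookup fails.
    sys.meta_path.append(FolderResolver(folder))
```

The key property, in either language, is that every hit and miss goes through one logged code path instead of failing silently.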
Unfortunately, this will only work for script tasks and not script components.
Provisioning environments and access isn’t something that I enjoy, but it’s really important. Without their environments, people may not be able to work, or at least not be as productive as they could be. People join and leave teams all the time.
This script pulls all the users from a given AD Group. It grants them OrgAuditor access to the PCF Org.
- PCF CLI
- PowerShell Active Directory module
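The script itself is PowerShell (pulling members with the Active Directory module, then calling the PCF CLI), but the shape of the work is simple. Here’s an illustrative Python sketch of the granting step; cf set-org-role is the real PCF CLI command, while the org and user names are placeholders:

```python
import subprocess

def grant_org_auditor(usernames, org, dry_run=True):
    """Build (and optionally run) the PCF CLI calls that grant OrgAuditor.
    Assumes a prior `cf login`; usernames would come from the AD group."""
    commands = []
    for user in usernames:
        cmd = ["cf", "set-org-role", user, org, "OrgAuditor"]
        commands.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)  # actually grant the role
    return commands
```

Running it in dry-run mode first is a cheap way to review exactly which grants will be made.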
You’ve got Splunk installed and configured. You set up an HTTP Event Collector (HEC) and its appropriate index(es). You try to post events to it, but get an error response.
I was able to correct the issue by switching off Enable indexer acknowledgement for that HTTP Event Collector token.
- Go to HTTP Event Collector
- Find the correct HTTP Event Collector token
- Uncheck Enable indexer acknowledgement
- Then you’ll get a successful response
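For reference, here’s a minimal sketch of what a HEC post involves (Python; the host, token, and index values are placeholders). When indexer acknowledgement is enabled, HEC additionally expects an X-Splunk-Request-Channel header on every request, which is why plain posts fail until the setting is turned off:

```python
import json

def build_hec_request(host, token, event, index=None, ack_channel=None):
    """Assemble the pieces of a Splunk HEC POST.
    Hypothetical helper: host, token, and index are placeholders."""
    url = f"https://{host}:8088/services/collector"
    headers = {"Authorization": f"Splunk {token}"}
    if ack_channel:
        # Only required when Enable indexer acknowledgement is on for the token.
        headers["X-Splunk-Request-Channel"] = ack_channel
    body = {"event": event}
    if index:
        body["index"] = index
    return url, headers, json.dumps(body)
```

Hand these to your HTTP client of choice; with acknowledgement disabled, no channel header is needed.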
You can read more about this issue on Splunk Answers.
I wrote in a previous article about how to setup NuGet.Server for yourself. Here are some additions to the web.config file that I would recommend.
requireApiKey makes sure that only trusted uploaders can contribute packages. While your private instance may be self-hosted and only internal, it’s still a best practice to restrict who can contribute packages.
apiKey simply allows you to define the desired key that will be shared with NuGet package contributors.
allowOverrideExistingPackageOnPush, when disabled, prevents an updated package from overwriting an existing package. I am very much against overwriting, because it breaks the contract with the consumer. As the consumer, I should be able to count on the fact that once a NuGet package is published as a particular version, it won’t change.
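Assuming a stock NuGet.Server layout, the corresponding appSettings entries could look like this (the key value is a placeholder):

```xml
<appSettings>
  <!-- Reject pushes that don't present the shared API key -->
  <add key="requireApiKey" value="true" />
  <!-- Placeholder value: share the real key only with trusted contributors -->
  <add key="apiKey" value="your-secret-key-here" />
  <!-- Keep published versions immutable -->
  <add key="allowOverrideExistingPackageOnPush" value="false" />
</appSettings>
```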
As a software development company grows, there’s a natural need to re-use code. This has been true for a long time and will remain so. There are many reasons:
- Reducing maintenance costs by reducing lines of code
- Maintaining consistency across projects
- Eliminating bugs in a single place
Open source libraries handle the lion’s share of this, but there’s often domain- or company-specific code that ends up getting duplicated. The way we solve this problem has changed.
Today, for .NET, I recommend using NuGet to package and version your re-usable code. Quite a few companies will be a little squeamish (with good reason) about posting any proprietary code publicly.
Fortunately, you can stand up your own private NuGet server. I found the original instructions on Scott Hanselman’s blog, but have added Elmah to help with troubleshooting.
- Open Visual Studio 2017
- Create a new ASP.NET Web Application targeting .NET Framework 4.6
- Select the Empty template
- Open the Package Manager Console
- Install the NuGet.Server package with the command Install-Package NuGet.Server
- Install the Elmah package with the command Install-Package elmah
- Right-click on the project and choose Manage NuGet Packages...
- Click I Accept to accept all the license terms
- You may have to restart Visual Studio 2017 as part of this process
- Then run your project
- You should see a screen like below
Grant the account running the site read and write permissions on the folder used to store the packages. By default, the application pool identity did not have sufficient permissions. The folder is configured in your web.config.
- Fun fact: NuGet must hash the packages, because just changing the filename and trying to re-upload will cause an error. You’ll get the below response.
You’ve worked hard creating unit tests using MSTest. That’s a great start. This doesn’t mean you’re finished though.
- How do you know that the tests are being run?
Running the tests
Jenkins makes this part easy. There are two different types of builds that you can choose from. I’ll share how to run unit tests with MSTest for both types.
You can add this command within a build step of a Freestyle Project or a Pipeline Project. I am assuming the build will run on a Windows node or the master.
The location of MSTest.exe will vary based on the install location and version of Visual Studio. The /resultsfile:"%WORKSPACE%\Results.trx" parameter stores the test output in the specified file. You can repeat the /testcontainer:"%WORKSPACE%\MyTests.dll" parameter to test many assemblies and add all the results into a single results file.
Security isn’t easy, but it’s becoming more important. There’s lots of evidence explaining the dangers of missing any flaws. One of the items flagged on a project was that it allowed IFrames from any other site. The findings referenced the X-Frame-Options header. In my particular case, the business wanted to allow IFraming across domains. This ruled out using DENY or SAMEORIGIN.
ALLOW-FROM would’ve fit the bill if it were widely supported. For MVC, you can leverage built-in web.config values or ActionFilter attributes. I was supporting a WebForms site, though.
In my case, I had to write some custom code, shown below. IIS can leverage the produced HttpModule. Values from the web.config allow only specific sites to IFrame the site the web.config belongs to. The module assumes that multiple sites will be semicolon-delimited.
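The core decision the module makes can be sketched like this (Python for brevity; the real HttpModule’s exact behavior may differ, and the function names are illustrative):

```python
def parse_allowed_sites(setting):
    # Split the semicolon-delimited web.config value into hostnames.
    return [h.strip().lower() for h in setting.split(";") if h.strip()]

def x_frame_options_for(referer_host, setting):
    """Return the X-Frame-Options value to emit, or None to allow framing.
    Hypothetical logic: trusted hosts get no header (so they may frame us),
    everyone else gets SAMEORIGIN."""
    if referer_host and referer_host.lower() in parse_allowed_sites(setting):
        return None  # trusted site: omit the header
    return "SAMEORIGIN"
```

Keeping the parsing separate from the decision makes both halves easy to unit test.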
Your web.config file could look like below for example.
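For example (the key name here is illustrative, not necessarily the one the module actually reads):

```xml
<configuration>
  <appSettings>
    <!-- Hypothetical key: semicolon-delimited sites allowed to IFrame this one -->
    <add key="AllowedFramingSites" value="partner.example.com;other.example.org" />
  </appSettings>
</configuration>
```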
Splunk Enterprise offers a great solution for anyone that has legal or compliance reasons requiring an on-premise setup. It’s very useful for developers that would like to do testing in a locally destructive fashion. One of the keys to creating an easy-to-maintain environment is getting authentication and authorization right. In my case, the vast majority of users belong to a shared Active Directory (AD) domain.
Splunk Enterprise does offer its own store of users. The reason for managing them with AD instead is that when people require access changes (leaving or joining teams or the company), having them all in one place makes things much simpler, particularly if other systems already use AD as a point of reference.
There are many sites that reference configuring Splunk Enterprise for AD authentication/authorization. I haven’t found any that go into enough detail to make it simple. I’ve attempted to do that below.
LDAP Configuration for User Roles
- Click Settings
- Under USERS AND AUTHENTICATION, click Access controls
- Click Authentication Method
- Under External, select LDAP
- Click LDAP Settings
- Pro tip: Make sure the Group base DN points to an OU containing all the groups rather than the root
- Make sure the User base DN is the root of AD
- To map groups:
  - Under Actions, click Map Groups
  - You can click any group from the Group base DN you provided
  - Then you can select the roles that a given AD group will have
- Manage Splunk user roles with LDAP
You have two great tools that you’d like to integrate. In this case, you’re using Bitbucket and Jenkins. You can configure Jenkins to check Bitbucket for changes (aka polling) to make a build. But this is clunky and repetitive.
There is a better way. Bitbucket offers a plugin called “Webhook to Jenkins for Bitbucket”. This plugin calls Jenkins for each new commit to a repository. This way Jenkins doesn’t call Bitbucket; Bitbucket calls Jenkins. It’s the Hollywood Principle: “Don’t call us, we’ll call you.”
Now, like so many times in programming, your solution to one problem has created another. In debugging, this is progress. You need to know how to stitch this together. You can configure the plugin by clicking Edit (the pencil icon) to bring up its settings screen. Once you enter all the information, click Trigger Jenkins to test the connection. You may see the following error:
Temporary failure in name resolution
You may need to provide the fully qualified domain name for the Jenkins instance. The machine name alone (e.g. awesome_machine) will not work. Assuming the fully qualified machine name is awesome_machine.awesome.domain and the port is 3456, your Jenkins URL would be http://awesome_machine.awesome.domain:3456. Once you do that, you’ll get a new error.
New Problem (Again!)
Once you click Trigger Jenkins, you may get an error stating No Git jobs using the repository.
New Solution (Again!)
To work around this, you can configure the job’s trigger to poll the SCM without a schedule. Do this by selecting Poll SCM and leaving the Schedule text area blank. You can see an example below.
It’s important to note that despite the above setting, Jenkins will never poll Git.
- Github: Debugging “Error: Jenkins Response: No git jobs using repository” #147
- Webhook to Jenkins for Bitbucket
Jenkins provides an outstanding open source continuous integration platform for a multitude of languages and technologies. It accomplishes this by allowing the community and other interested parties to contribute plugins. Many of these plugins are frequently updated, which is amazing! Even though Jenkins has a pretty nice user interface (UI) for updating plugins, it gets tedious, since on a system at scale there could be updates daily.
Fortunately, Jenkins provides a really straightforward command line interface (CLI). This allowed me to create a PowerShell script that updates all of the installed plugins to their latest versions. I configured this to run weekly and it’s been a huge time saver. The added benefit is that you get all the latest features for your plugins without doing anything.
I configured it to send an email out with the plugins that have been updated. I had to copy the PowerShell script into a gist to make it display correctly here, but here is the proper repository in case you are interested.
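The heart of such a script is: call the CLI’s list-plugins command, pick out plugins whose line ends with a parenthesized newer version, then install-plugin each one and safe-restart. Here’s an illustrative Python sketch of just the parsing step (the sample output lines are made up, and the exact column layout can vary between Jenkins versions):

```python
import re

def plugins_needing_update(list_plugins_output):
    """Return short names of plugins whose list-plugins line ends with a
    parenthesized newer version, assuming the common
    `name  display-name  version (latest)` column shape."""
    names = []
    for line in list_plugins_output.splitlines():
        line = line.strip()
        # A trailing "(x.y.z)" marks an available update.
        if line and re.search(r"\(\S+\)$", line):
            names.append(line.split()[0])
    return names

sample = """\
git                Git plugin                 4.11.0 (4.14.3)
credentials        Credentials Plugin         2.6.1
workflow-job       Pipeline: Job              2.40 (2.42)
"""
print(plugins_needing_update(sample))  # ['git', 'workflow-job']
```

Each returned name would then be passed to install-plugin, followed by one safe-restart at the end.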