
How can I access the payload info of a REST call against ElectricCommander?

I'm trying to set up automated deployments with GitHub's Deployments API. GitHub lets you configure webhooks pointing at a third-party deployment server, to which it will send deployment events, pushes, pulls, and other useful data as HTTP POSTs against your REST-enabled server.

https://developer.github.com/v3/repos/deployments/

Is there a way to access the payload info of a REST call against ElectricCommander?

I've set up the webhook to send the GitHub events as a JSON-encoded payload to a runProcedure call (the procedure just does a hello-world echo):

https://USERNAME:PASSWORD@COMMANDER_SERVER_HOSTNAME:8443/rest/v1.0/jobs?request=runProcedure&projectName=GithubDeploy&procedureName=Deploy

I can see the REST event in the Commander logs and the job runs successfully (no params defined). GitHub even shows the response from EC, including the 36-character job ID.

Is there a way to access this payload? Perhaps have EC somehow save payload info into a job property?

The payload from GitHub looks something like this (truncated example from GitHub):

 {
   "action": "closed",
   "number": 7,
   "pull_request": {
     "url": "https://GITHUB_SERVER_HOSTNAME/api/v3/repos/GITHUB_ORG/GITHUB_REPO_NAME/pulls/7",
     "id": 555,
     "html_url": "https://GITHUB_SERVER_HOSTNAME/GITHUB_ORG/GITHUB_REPO_NAME/pull/7",
     "diff_url": "https://GITHUB_SERVER_HOSTNAME/GITHUB_ORG/GITHUB_REPO_NAME/pull/7.diff",
     "patch_url": "https://GITHUB_SERVER_HOSTNAME/GITHUB_ORG/GITHUB_REPO_NAME/pull/7.patch",
     "issue_url": "https://GITHUB_SERVER_HOSTNAME/api/v3/repos/GITHUB_ORG/GITHUB_REPO_NAME/issues/7",
     "number": 7,
     "state": "closed",
     "locked": false,
     "title": "Hubot scripts to interact with EC Flow and Grafana dashboards.",
     "user": {
       "login": "GITHUB_USER_2",
       "id": 28,
 .....
 
     "merged": true,
     "mergeable": null,
     "mergeable_state": "unknown",
     "merged_by": {
       "login": "GITHUB_USER",
       "id": 16,
       "avatar_url": "https://GITHUB_SERVER_HOSTNAME/avatars/u/16?",
       "gravatar_id": ""
 ....
 }

The point of the job would be to filter on the event type (via the action payload variable) and the merge status (merged=true) to validate that the merge succeeded, and then kick off a deploy with the latest merged code.
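A minimal sketch of that filter in Python, using only field names that appear in the example payload above (action, pull_request, merged):

```python
import json

def should_deploy(payload_json):
    """Return True when a pull_request 'closed' event represents a
    successful merge, i.e. a deploy should be kicked off."""
    event = json.loads(payload_json)
    if event.get("action") != "closed":
        return False
    pr = event.get("pull_request", {})
    return pr.get("merged") is True

# Example: a merged PR event should trigger a deploy
merged_event = json.dumps({
    "action": "closed",
    "pull_request": {"merged": True, "number": 7},
})
print(should_deploy(merged_event))  # True
```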

By dpmex4527, asked Sep 16, 2015 at 09:54 PM
4 answers

Really really short answer: No, there is no practical means to access the payload from this REST API call.

Short answer: You don't want to do it this way. Instead, create a simple web service that receives the event from github, marshals the payload along with any other parameters, and in turn uses the Commander/Flow API to run the procedure (passing in the payload as an argument value).
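A minimal sketch of that intermediary's core, under stated assumptions: the GithubDeploy/Deploy names come from the question's URL, the actualParameter list shape follows common Commander REST conventions (verify against your server's docs), and the actual Commander call is injected as a callable so the sketch works offline:

```python
import json

def handle_webhook(body, run_procedure):
    """Validate a GitHub webhook body, then hand it to the Commander API.

    run_procedure stands in for a thin wrapper around the real
    runProcedure REST call; injecting it keeps this sketch testable
    without a live Commander server."""
    event = json.loads(body)  # reject non-JSON bodies early
    if "action" not in event:
        raise ValueError("not a recognizable GitHub event")
    return run_procedure(
        projectName="GithubDeploy",
        procedureName="Deploy",
        # forward the raw payload as a single procedure parameter
        actualParameter=[{"actualParameterName": "payload", "value": body}],
    )
```

This is the structural point of the answer: the credentials live inside `run_procedure` on your internal server, never on the GitHub side.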

Long answer:

First, the reason you don't really want to have the Flow/Commander REST API capture the payload is that you really don't want to have your entire Flow/Commander port open to the internet! Even if nobody hacks your server, I can imagine a lot of harm being done to your internal user community by a simple DDOS attack on that port.

The second reason is that for this to work, you'd need valid credentials for your Flow/Commander server stored on a public site (github) -- and I don't care how secure they SAY that is: no security team I've ever encountered would allow a valid user/password for an internet-exposed internal service to be stored in that fashion. Especially when the alternatives are so easy and so much better.

Here's one - I prototyped this not too long ago, so I know it works. Stand up a simple web server, configured so that it has a single valid page (URL) that launches a single CGI (written in PHP or even Perl -- not a shell script; they're not secure). The CGI script captures the external event and its payload, and does some basic validation of same (to protect the Commander/Flow server from someone trying to hack your runProcedure call by deliberately passing in carefully-crafted JSON, for example). The CGI should also do some throttling -- it should touch a file and compare the date-time stamp, for example, to limit the calls it makes to the Flow/Commander API to some reasonable number of runProcedures/minute. This will make sure that it isn't being used for a DDOS attack on your server - whether that be malicious or due to configuration error or programming error back on github or whatever. You'll note that using this mechanism, it is this internal server that will contain the credentials for your Flow/Commander server, so that they do not have to be given to github - your security team will feel better about that for sure.
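The touch-a-file throttle described above can be sketched as follows; the interval value is an arbitrary assumption, and a real CGI would call this before forwarding anything to Commander:

```python
import os
import time

MIN_INTERVAL = 10.0  # seconds between forwarded calls (arbitrary assumption)

def allow_request(stamp_file, now=None, min_interval=MIN_INTERVAL):
    """Touch-file throttle: permit a call only if min_interval has elapsed
    since the last permitted call, as recorded in stamp_file's mtime."""
    now = time.time() if now is None else now
    try:
        last = os.path.getmtime(stamp_file)
    except OSError:
        last = 0.0  # no stamp file yet: first call is always allowed
    if now - last < min_interval:
        return False
    # record this permitted call by (re)touching the stamp file
    with open(stamp_file, "a"):
        pass
    os.utime(stamp_file, (now, now))
    return True
```

Whether malicious or caused by a misconfigured webhook re-firing, a flood of events then costs at most a few runProcedures per minute.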

I'd recommend that this be done on an existing web instance in your DMZ - but for prototyping, I mocked it up using a simple CGI plugin on the Commander server itself, since my application did not require opening this to the internet. In a nutshell, the standard URL to hit a given CGI plugin in Commander requires that you be logged in - but there's another URL that you can give your Commander web server that hits the plugin CGI directly. When used in that latter method, the running CGI script (ec-perl in my case) does not have a valid session, thus it is relatively safe (NB: I say relatively - it's good enough for internal access, but I would NEVER expose a Commander/Flow web UI to the internet!). The CGI in question does exactly what I described above - in other words, it's a plugin ONLY so that I could use the existing Apache web server for this purpose rather than standing up another web server somewhere.

By mike westerhof, answered Sep 17, 2015 at 06:36 PM
dpmex4527 commented Sep 18, 2015 at 02:34 AM

I can always count on your insightful responses, Mike! The good thing about this setup is that we're not using the public instance of GitHub but an internal instance of GitHub Enterprise hosted within our network.

Your suggestions make a lot of sense. It'll likely be a lightweight Sinatra web service, and we can certainly use a throttling method like this to ensure no malicious traffic hits our service. We do trust our engineers not to DDoS us, but any sort of chaos-monkey setup can easily be averted with this suggestion.

Assuming the payload is stored in the step's log, you can use the jobStep get call to retrieve the job step details using the job ID from that runProcedure call. Within that you will find logFileName. In the workspace you can then retrieve the log under the folder with the same name as the job, and parse it for whatever information you need. Let me know if this helps. :)

By juan, answered Sep 17, 2015 at 07:51 AM
dpmex4527 commented Sep 17, 2015 at 02:20 PM

Unfortunately the payload is not stored in the step log. It is only visible in Commander's server log, when the REST backend detects the call and initiates the job. What I want is to parse the GitHub event payload sent to Commander by accessing the body of the POST request.


Parameterize your procedure and pass the payload as a parameter; then you can easily access that parameter and do whatever you want with it. Am I missing something? (Unfortunately we haven't upgraded to the latest version yet, otherwise I would have done some testing.)
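A hedged sketch of what that could look like over REST: the request-body shape for actual parameters here is an assumption based on common Commander REST conventions, so verify it against your server's /rest/v1.0 documentation. The snippet only builds the request; it does not send it.

```python
import json
import urllib.request

def build_run_procedure_request(base_url, payload):
    """Build (but do not send) a runProcedure POST that carries the
    webhook payload as a 'payload' actual parameter. The project,
    procedure, and parameter names mirror the question's URL; the
    JSON body layout is an assumed convention, not a verified spec."""
    body = json.dumps({
        "projectName": "GithubDeploy",
        "procedureName": "Deploy",
        "actualParameter": [
            {"actualParameterName": "payload", "value": payload},
        ],
    }).encode("utf-8")
    return urllib.request.Request(
        base_url + "/rest/v1.0/jobs?request=runProcedure",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Inside the procedure, the parameter value would then be available to step commands the usual way parameters are (e.g. property expansion), so the step can parse the JSON and branch on it.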

By vijaybandari, answered Mar 22, 2016 at 04:27 AM

I'm not sure I 100% agree that the suggested answers are reasonable here. It's not clear to me whether the EC endpoints being exposed to the open web has anything to do with the ability to examine the request body from within the procedure context; the two ideas seem logically separate. In the situation where both the code repo and EC are hosted within the corporate network, which I imagine is quite common, exposure isn't really a concern. Many source management tools like GitLab provide webhooks with well-known POST body schemas, which should make integration with automated build systems easy. Frankly, requiring developers/operations people to stand up an internal (and long-lived, and reliable, etc.) web service just to translate these payloads to URL parameters seems unreasonable. Doubly arcane is the suggestion of examining the build log file during the procedure (!?), which has the additional drawback of uncertainty in the log format specification.

If these answers are still current, it's surprising to me that the EC RESTful API specification even describes the 'body' at all -- why include it when it's inaccessible?

Apologies if I've misunderstood any of these answers. I was excited by the prospect of the REST API and disheartened when I learned of this limitation.

By joreilly, answered Apr 10 at 10:49 PM
joreilly commented Apr 10 at 10:49 PM

Just realized that I posted this as an answer; maybe it should be a comment? New to this board.
