
Pin docker API version to support minikube #5

Closed · wants to merge 1 commit

Conversation

@tobias
Contributor

tobias commented Jun 1, 2017

Minikube currently uses Docker 1.23, and Openwhisk uses a newer
client. But Openwhisk doesn't currently require anything that isn't
provided by 1.23, so pinning the API version to 1.23 allows Openwhisk to
be used with minikube.

I've signed the CLA as "Tobias Oliver Crawley".
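For context, the pin described above relies on standard Docker client behavior: the client can be told to speak an older API version, for example via the DOCKER_API_VERSION environment variable. A minimal illustrative sketch (the actual change in this PR lives in the OpenWhisk invoker configuration, not in shell):

```shell
# Illustrative sketch: the stock Docker client can be pinned to an
# older API version via the DOCKER_API_VERSION environment variable.
# Minikube's bundled Docker daemon implements API version 1.23.
export DOCKER_API_VERSION=1.23

# Any docker CLI call in this shell now negotiates API 1.23, e.g.:
#   docker version --format '{{.Client.APIVersion}}'
echo "$DOCKER_API_VERSION"
```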

@animeshsingh

Thanks @tobias - LGTM
@DanLavine please review it

@mrutkows
Contributor

mrutkows commented Jun 5, 2017

See:
https://github.com/kubernetes/minikube/blob/master/pkg/minikube/constants/constants.go

which pins the version using this constant:
// DockerAPIVersion is the API version implemented by Docker running in the minikube VM.
const DockerAPIVersion = "1.23"

@mrutkows
Contributor

mrutkows commented Jun 6, 2017

@tobias Before accepting the PR (I've never used minikube myself): your requested change to pin the client seems reasonable; however, are you claiming that this is the only change needed to support minikube? If so, I would like to see documentation to that effect, since we currently state minikube is "not supported" due to Docker engine version dependencies. Also, if we go down this road, we will need integration test(s) to verify minikube installation/deployment.

@mrutkows
Contributor

mrutkows commented Jun 6, 2017

@animeshsingh apart from "LGTM", can you independently verify this is the only change to get a minikube deployment working?

@ghost

ghost commented Jun 6, 2017

Hey, sorry for the delay, I was off last week. This seems fine to me, but as @mrutkows said, it would be great if we could get an update to the docs so that the note about Minikube not being supported is removed, possibly with a link to Minikube as a supported deployment option.

@bbrowning
Contributor

The last time I tried minikube, you still had to "minikube ssh" and run "ip link set docker0 promisc on" to get things working.
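For readers following along, the workaround mentioned above amounts to the following command transcript (this assumes a running minikube VM and enables promiscuous mode on the docker0 bridge inside that VM; alternatively, run the ip link command interactively after "minikube ssh"):

```shell
# Workaround sketch (requires a running minikube VM):
# put the docker0 bridge inside the VM into promiscuous mode so that
# traffic between containers on the bridge is delivered correctly.
minikube ssh -- sudo ip link set docker0 promisc on
```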

@mrutkows
Contributor

mrutkows commented Jun 6, 2017

Seeing that there may be more to this (based upon @bbrowning's comment), can we get an integration test going?

@animeshsingh

@mrutkows so it works with the "minikube ssh" / "ip link set docker0 promisc on" workaround. We have created a developer journey targeting the Bluemix container service and minikube, and are in the process of finalizing Travis for both.

@mrutkows
Contributor

mrutkows commented Jun 7, 2017

@animeshsingh it is preferable that we include the docs/code/Travis enablement for Minikube here within this Apache repo. I prefer not to enable something (and pin an old client) without proving it works as part of our integration testing and making it accessible to the community. It should "work" here and be validatable from this repo's source code, whether through "manual" docs or by automation.

If you are working on a solution that includes Travis, docs, etc., then feel free to submit a WIP pull request with all these changes so they are accessible to the Apache community in this repo. It is clear others are interested in this work, and this is the place we should develop it.

@mrutkows
Contributor

mrutkows commented Jun 7, 2017

@tobias Can we make the invoker.yml optionally "pin" the version only for a Minikube deployment through some build target environment via Ansible, that is, create a "minikube" target? Obviously, pinning the client is specific to Minikube and not required for general kube deployments; why should we hard-code this restriction?
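The opt-in pin suggested above might look something like this; note that "docker_api_version" is a hypothetical variable name used for illustration, not existing configuration in this repo:

```shell
# Hypothetical sketch of an opt-in pin via an Ansible extra-var.
# Only the minikube target passes the variable; general kube
# deployments omit it and keep the default client API version.
ansible-playbook invoker.yml -e docker_api_version=1.23   # minikube target
ansible-playbook invoker.yml                              # everything else
```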

@animeshsingh

@mrutkows agreed. At this time the conclusion is that minikube won't work with Travis; if we figure out a way, we will add it back here as well.

@tobias
Contributor Author

tobias commented Jun 7, 2017

@mrutkows Yes, we could make this optional via Ansible. Another option is to wait for the next release of minikube - it should have a newer Docker in it (kubernetes/minikube#1542). Though I don't know when that release will be out.

@mrutkows
Contributor

mrutkows commented Jun 7, 2017

@tobias I'm not saying "wait"; perhaps I was being too harsh...

All I want is that IF we want to add Minikube as a supported deployment, we do not simply "pin" the client version (with the rest of the work done elsewhere), but instead figure out how to submit a PR that works towards a side-by-side deployment: one that documents the intent and how to build/config/install, plus Travis integration to show it runs.

I would never go to a project and say "add this library for X" because I need it for my work elsewhere... that is what this feels like.

My hope is that we can somehow "pin" versions of other software so that it is not done for every possible deployment (only for Minikube deployments in this case). Start with this approach and submit a PR with these things: a separate subdir with Minikube environments, and a Minikube readme documenting the caveats, version issues, prerequisites, etc.

@mrutkows
Contributor

mrutkows commented Jun 7, 2017

@animeshsingh thanks Animesh, we know Minikube is very popular and we want it to work here and be supportable by this community.

@tobias
Contributor Author

tobias commented Jun 8, 2017

@mrutkows I only mentioned waiting since it seemed like wasted effort to add special-case ansible configuration to support something that may be a non-issue with the next minikube release. However, since minikube does require setting promiscuous mode as well, it's already a special case, so I agree that having ansible config for it makes sense. I can take a look at adding that, along with a section to the readme about usage. I'll do that as a new PR, and we can close this one.

What is the issue with Travis? Is it that they don't support spinning up full VMs? If we can't test minikube via CI, is it still acceptable to add it to the README, but with caveats?

@tobias
Contributor Author

tobias commented Jun 8, 2017

I just noticed IBM/OpenWhisk-on-Kubernetes#22 - @Tomcli - mind if I use that as a basis for the minikube readme here?

@mrutkows
Contributor

mrutkows commented Jun 8, 2017

@tobias it is desirable (or rather, imperative) to have Travis CI confirmation of any deployment (target) that we wish to claim support for. That way, we can ensure that future changes made for other deployments do not break that support. It also serves as a means to prove (by running example) that it works, for developers who wish to replicate it. It is simply best practice.

@tobias
Contributor Author

tobias commented Jun 8, 2017

@mrutkows I agree, CI for this is critical, and I'm well aware of the benefits of CI. But according to @animeshsingh's comment, minikube doesn't (yet) work on Travis. So the choice is to add support for minikube now (with caveats about it only being manually tested), or to wait until it is supported by Travis (or to add another CI provider to the mix).

@Tomcli

Tomcli commented Jun 8, 2017

@tobias Sure. FYI, IBM/OpenWhisk-on-Kubernetes#22 is just a temporary solution for those who want to try it on Minikube right now. We still need to create a feasible solution in the future.

@animeshsingh

@tobias our primary target in the IBM/OpenWhisk repo is the Bluemix container service with Kubernetes, and we are running CI against that. For the minikube solution @Tomcli mentions, we will rely on the core repo and redirect users here. So what we have with minikube right now is a temporary way to get going, until formal support is added in core.

@mlangbehn
Contributor

Sorry I missed the conversation here for a little bit. @tobias and @mrutkows, the issue @animeshsingh mentioned with running Minikube in Travis CI is that you cannot create a VM with either VirtualBox or KVM in Travis CI (regardless of whether you run Travis CI VMs or containers). I suspect it would take a major change to Travis CI VMs to support this.

As a workaround (really more of a give-up-and-redirect), I am basing our Travis testing in IBM/OpenWhisk-on-Kubernetes off the work done in this repo.

@mrutkows
Contributor

mrutkows commented Jun 8, 2017

@mlangbehn Thanks for the update. If we need to support Jenkins (or another CI) we can discuss, but it's sad to see you doing your work in some other IBM repo, as this should be part of the Apache community project.

@mrutkows closed this Jun 8, 2017
@mrutkows
Contributor

mrutkows commented Jun 8, 2017

Please submit a new PR that includes a README, a separate directory structure, and working (or close to working) Travis or Jenkins. It should also include a strategy to allow "pinning" (install/config) of other Minikube-specific dependencies to be selectively included/changed.

AND please mark it "experimental"

@tobias deleted the minikube-support branch June 12, 2017 15:01

6 participants