
Create and document a more systematic automated testing procedure #483

Closed
machawk1 opened this issue Aug 8, 2018 · 9 comments · Fixed by #484


@machawk1 (Member) commented Aug 8, 2018

While writing tests in 1bf14a3, I could not get my local Python, pip, IPFS, and ipwb installations into a consistent environment that reproduced the results TravisCI reported in PRs.

With @ibnesayeed's help, I was able to get replicable testing using our current Docker build and configuration, but the commands needed to perform this operation, along with the environment setup (e.g., installing requirements per the .travis.yml), are things I will forget.

@ibnesayeed mentioned that a Dockerfile could also be created for testing. Let's create and document this to replicate the TravisCI and pytest runs and to have a more consistent, replicable test-running environment independent of the CI system.

```shell
docker container run --rm -it -v "$PWD":/ipwb -p 5000:5000 oduwsdl/ipwb bash
```

Once in the container:

```shell
pip install -r requirements.txt
pip install -r test-requirements.txt
pip install pycodestyle
pip install codecov
pip install pytest-cov
pycodestyle
ipfs init
ipfs daemon & sleep 10
py.test --cov=./
```
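To keep these steps from being forgotten, they could be collected into a single helper script to run inside the container. This is only a sketch; the script name is hypothetical, and it assumes it is run from the ipwb source root:

```shell
# Sketch: the in-container steps above gathered into one script
# (the file name run-tests-in-container.sh is an assumption).
cat > run-tests-in-container.sh <<'EOF'
#!/bin/sh
pip install -r requirements.txt
pip install -r test-requirements.txt
pip install pycodestyle codecov pytest-cov
pycodestyle
ipfs init
ipfs daemon & sleep 10
py.test --cov=./
EOF
chmod +x run-tests-in-container.sh
```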
@ibnesayeed (Member)

I have created a Dockerfile.test with the following content to create a test image, but I think I will make testing part of the build process so that the documentation stays minimal and image builds fail if tests fail.

```dockerfile
FROM       oduwsdl/ipwb

RUN        pip install pycodestyle codecov pytest-cov

COPY       test-requirements.txt ./
RUN        pip install -r test-requirements.txt

CMD        pycodestyle && ipfs daemon & while ! curl -s localhost:5001 > /dev/null; do sleep 1; done && py.test --cov=./
```
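A small wrapper could make building and running this test image a one-step operation. A sketch, assuming the file above is saved as Dockerfile.test next to the main Dockerfile; the image tag and script name are assumptions:

```shell
# Sketch: build the test image from Dockerfile.test and run it
# (the oduwsdl/ipwb:test tag and script name are hypothetical).
cat > run-docker-tests.sh <<'EOF'
#!/bin/sh
docker image build -f Dockerfile.test -t oduwsdl/ipwb:test . &&
docker container run --rm oduwsdl/ipwb:test
EOF
chmod +x run-docker-tests.sh
```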

@ibnesayeed (Member)

Looking at the requirements.txt file, I am wondering why it contains pytest:

```text
pywb==0.33.2
ipfsapi>=0.4.2
Flask==0.12.2
pycryptodome>=3.4.11
pytest>=3.6
```

Looking at test-requirements.txt, I am wondering why it does not contain some dependencies, such as pycodestyle and codecov, that are being installed separately:

```text
flake8>=3.4
pytest>=3.6
pytest-cov
pycryptodome>=3.4.11
pytest-flake8
```
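One way to reduce this duplication, sketched below, would be to have test-requirements.txt include the runtime requirements via pip's `-r` include syntax and fold in the tools currently installed by hand; whether pycodestyle and codecov belong in this file is an assumption:

```shell
# Sketch: a consolidated test-requirements.txt that pulls in the runtime
# requirements and adds the separately installed tools (an assumption).
cat > test-requirements.txt <<'EOF'
-r requirements.txt
flake8>=3.4
pytest>=3.6
pytest-cov
pytest-flake8
pycodestyle
codecov
EOF
```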

@ibnesayeed (Member)

I have made some changes in #484 to perform tests during the image build process. Now testing can be performed locally with the following command:

```shell
$ docker image build -t oduwsdl/ipwb .
```

The first time it might take a while to install dependencies, but successive builds will be fast and will quickly reach the testing stage.

@machawk1 (Member, Author) commented Aug 8, 2018

@ibnesayeed In addition to these varying, inconsistent requirements files, there is also a set of requirements defined in setup.py. This was needed, I believe, due to the installation procedure (e.g., python setup.py install vs. pip install . vs. pip install ipwb), so if we are changing them, we ought to test at least these three installation procedures to ensure consistency. The better approach would be to have one set of requirements across the board, but I do not know the required dynamics behind this.

@ibnesayeed (Member)

The current master branch fails python setup.py install due to conflicting dependencies on Jinja2, unless pip install -r requirements.txt is run before it.

pip install . works fine, with and without my changes.

pip install ipwb works, but reports the Jinja2 conflict issue.

I have synchronized the dependencies in setup.py with those in the two requirements files, but the situation described above persists, which is no worse than the current state.

@ibnesayeed (Member)

> The better approach would be to have 1 set of requirements across the board but I do not know the required dynamics behind this.

Pipfile is a potential improvement as per #366.
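A Pipfile along these lines could replace both requirements files by splitting runtime and test dependencies. The package list below is copied from the requirements shown above; the layout follows Pipenv's `[packages]`/`[dev-packages]` convention, and the exact pinning style is an assumption:

```toml
# Hypothetical Pipfile sketch (per #366); contents mirror the two
# requirements files quoted earlier in this thread.
[packages]
pywb = "==0.33.2"
ipfsapi = ">=0.4.2"
Flask = "==0.12.2"
pycryptodome = ">=3.4.11"

[dev-packages]
flake8 = ">=3.4"
pytest = ">=3.6"
pytest-cov = "*"
pytest-flake8 = "*"
pycodestyle = "*"
codecov = "*"
```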

@ibnesayeed (Member)

By the way, if you want to test the installation experience of different commands on a clean slate, you can use Docker like this (assuming you are in the directory where the ipwb code lives):

```shell
$ docker container run --rm -it -v "$PWD":/ipwb -w /ipwb python:2.7 bash
# pip install .
```

Then you can exit from the container and repeat the process for the next installation method.
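The three installation procedures discussed earlier could also be exercised in a loop, each against a fresh container. A sketch, assuming Docker is available and the ipwb source is in the current directory; the script name is hypothetical:

```shell
# Sketch: try each install method on a clean python:2.7 container
# (the file name test-install-methods.sh is an assumption).
cat > test-install-methods.sh <<'EOF'
#!/bin/sh
for cmd in "python setup.py install" "pip install ." "pip install ipwb"; do
  docker container run --rm -v "$PWD":/ipwb -w /ipwb python:2.7 sh -c "$cmd" \
    || echo "FAILED: $cmd"
done
EOF
chmod +x test-install-methods.sh
```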

@machawk1 (Member, Author) commented Aug 8, 2018

Related question, @ibnesayeed: should we also have Travis run tests for our ipwb Docker configuration? There is the potential for a discrepancy if we are not diligent about keeping the Docker and native Python versions consistent.

@ibnesayeed (Member)

Docker builds will fail if the tests fail there, so there is no need to retest that separately. You can enable email notifications for build failures on DockerHub.
