Merge develop to release-0.4 branch (#2830)

* Update OptFlow FSL generation code when len(variables) == 2 (#2778)

* change optflow api

* polish

* add more ut

* alps submitter and codegen (#2771)

* test alps submitter

* add alps codegen

* update

* Support downloading the model to local in CLI (#2779)

* support downloading the model to local in CLI

* update

* update

* update

* Add query api to db.py (#2782)

* Add query to db.py

* change delete with truncate

* Set EnableWindowFunc to be true by default for TiDB parser. (#2786)

* Make the attribute check for XGBoost model compatible with reg:linear

* set EnableWindowFunc to true by default for TiDB parser.

* fix Tensorflow -> TensorFlow (#2783)

* Add optimization guide doc (#2785)

* add optimization guide doc

* polish according to comments

* Generate workflow using runtime (#2784)

* WIP generate workflow using runtime

* wip update

* update

* update

* fix hive ci

* DB api base class (#2787)

* Add query to db.py

* change delete with truncate

* DB interface base class

* add to_dict and from_dict method (#2792)

* Fix develop jupyter image build (#2790)

* fix develop jupyter image build

* update

* Generate workflow step code using runtime feature derivation (#2791)

* WIP generate workflow step code using runtime feature derivation

* tested local

* update

* update

* update

* fix tests

* Add MySQL db-api implementation (#2793)

* Add query to db.py

* change delete with truncate

* DB interface base class

* Add MySQL db-api implementation

* remove unused import

* fix actions maxcompute test not running (#2795)

* Enable flake8 check on CI (#2788)

* test ci

* test again

* update

* update and fix

* fix travis ci env

* generate python feature column code (#2797)

* Add hive DB-API (#2798)

* Add query to db.py

* change delete with truncate

* DB interface base class

* Add MySQL db-api implementation

* remove unused import

* polish mysql db-api

* Add hive DB-API

* modify doc

* format code

* modify cora dataset to adapt to csv format (#2780)

* Add json dump and load support for FeatureColumn (#2794)

* add json dump load support

* update vocabulary type

* update

* update

* update
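
The to_dict/from_dict methods (#2792) and the JSON dump/load support for FeatureColumn (#2794) point at a serialize-to-dict, dump-to-JSON round trip. The sketch below is a hypothetical illustration of that idea; the class, fields, and type tag are assumed for the example and are not the actual SQLFlow feature column definitions.

```python
# Hypothetical sketch of a to_dict/from_dict + JSON round trip for a
# feature column; names are illustrative, not the real SQLFlow classes.
import json


class NumericColumn(object):
    def __init__(self, name, shape):
        self.name = name
        self.shape = shape

    def to_dict(self):
        # Record a type tag so loading code knows which class to rebuild.
        return {"type": "NumericColumn", "name": self.name, "shape": self.shape}

    @classmethod
    def from_dict(cls, d):
        return cls(d["name"], d["shape"])


def dump_column(col):
    return json.dumps(col.to_dict())


def load_column(s):
    d = json.loads(s)
    # A real implementation would dispatch on d["type"] through a registry.
    if d["type"] != "NumericColumn":
        raise ValueError("unknown feature column type: %s" % d["type"])
    return NumericColumn.from_dict(d)


if __name__ == "__main__":
    col = NumericColumn("petal_length", [1])
    restored = load_column(dump_column(col))
    print(restored.name, restored.shape)  # petal_length [1]
```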

* Add maxcompute DB-API (#2801)

* Add maxcompute DB-API

* remove unused import

* format code

* Push images on self-hosted machine (#2799)

* push images on self-hosted machine

* update

* update

* update

* update

* test install.sh

* fix go mirrors

* clean up

* add clean up

* update clean up script

* fix pai xgboost package deps (#2803)

* Simplify TO RUN command - use filename instead of absolute path for the executable or script program (#2804)

* Make the attribute check for XGBoost model compatible with reg:linear

* Derive the absolute path of the runnable program if users just input a file name.

* Use python -m command to invoke the TO RUN statement in default submitter.

* Move getRunnableProgramAbsPath to alisa.go

* Polish DB-API code, export unified connect function from package. (#2808)

* Add maxcompute DB-API

* remove unused import

* format code

* polish db-api
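
The DB-API series above (#2782, #2787, #2793, #2798, #2801, #2808) converges on a small driver interface plus a single connect() entry point exported from the package. The sketch below is a hypothetical illustration of that pattern, not the actual SQLFlow API: class and method names are assumed, and SQLite stands in for the real MySQL/Hive/MaxCompute drivers so the example runs as-is.

```python
# Hypothetical sketch of the DB-API pattern: an abstract Connection plus a
# unified connect() that dispatches on the URI scheme. Names are assumed,
# and SQLite stands in for the real MySQL/Hive/MaxCompute drivers.
import sqlite3


class Connection(object):
    """Interface each driver implementation is expected to provide."""

    def query(self, statement):
        """Run a query statement and return an iterable of result rows."""
        raise NotImplementedError

    def execute(self, statement):
        """Run a statement that returns no rows (DDL/DML)."""
        raise NotImplementedError

    def close(self):
        raise NotImplementedError


class SQLiteConnection(Connection):
    """Stand-in driver; a real MySQL/Hive/MaxCompute driver would look similar."""

    def __init__(self, dataset):
        self._conn = sqlite3.connect(dataset)

    def query(self, statement):
        return self._conn.execute(statement).fetchall()

    def execute(self, statement):
        self._conn.execute(statement)
        self._conn.commit()

    def close(self):
        self._conn.close()


def connect(uri):
    """Unified entry point: pick a driver based on the URI scheme."""
    scheme, _, rest = uri.partition("://")
    if scheme == "sqlite":
        return SQLiteConnection(rest)
    raise ValueError("unsupported scheme: %s" % scheme)


if __name__ == "__main__":
    conn = connect("sqlite://:memory:")
    conn.execute("CREATE TABLE t (a INTEGER)")
    conn.execute("INSERT INTO t VALUES (1)")
    print(conn.query("SELECT * FROM t"))  # [(1,)]
    conn.close()
```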

* add solved y to optimize (#2810)

* Generate couler code of workflow steps (#2806)

* wip

* fix yaml generate

* fix tests

* fix package deps

* fix pip package deps

* update

* Refine metadata collection and save/load (#2807)

* move and refine metadata

* fix ci ut

* fix ut

* follow lhw comment

* Adapt paiio with DB-API (#2809)

* Add maxcompute DB-API

* remove unused import

* format code

* polish db-api

* Adapt paiio with DB-API

* Adapt paiio with DB-API

* add try import paiio

* fix typo

* disable actions maxcompute test (#2814)

* make constraint optional (#2812)

* fix typo (#2820)

* Install BARON solver in Docker image (#2811)

* install baron solver in Docker image

* polish

* add pyomo baron into step docker image

* Polish DB-API to support Python2 so it can run on PAI (#2815)

* polish db-api to support Python2 so it can run on PAI

* enable unittest for hive db-api

* switch to github actions (#2818)

* Add experimental workflow end2end test (#2813)

* add experimental workflow end2end test

* fix workflow ci env

* update test code

* pull latest step before running workflow

* Add Model.save_to_oss and Model.load_from_oss (#2817)

* add save_to_oss/load_from_oss

* change pickle protocol

* add more explanations on oss_model_dir doc

* fix ut
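
For the Model.save_to_oss/Model.load_from_oss commits (#2817), a hedged sketch of the general approach follows: pickle the model object with a fixed protocol and put/get it as an OSS object via the oss2 SDK. The object key layout, helper names, endpoint and bucket are placeholders rather than the actual Model API, and protocol 2 is only an assumed reading of the "change pickle protocol" commit.

```python
# Hedged sketch of saving/loading a model object to OSS. Key layout,
# helper names, endpoint and bucket are placeholders; protocol 2 is an
# assumption (it keeps the pickled bytes readable from Python 2).
import pickle

import oss2  # pip install oss2


def save_to_oss(model_obj, bucket, oss_model_dir):
    """Serialize model_obj and upload it under oss_model_dir."""
    payload = pickle.dumps(model_obj, protocol=2)
    bucket.put_object(oss_model_dir.rstrip("/") + "/model.pkl", payload)


def load_from_oss(bucket, oss_model_dir):
    """Download and deserialize a model saved by save_to_oss."""
    body = bucket.get_object(oss_model_dir.rstrip("/") + "/model.pkl").read()
    return pickle.loads(body)


if __name__ == "__main__":
    # Placeholder credentials, endpoint and bucket; fill in real values to run.
    auth = oss2.Auth("<access_key_id>", "<access_key_secret>")
    bucket = oss2.Bucket(auth, "https://oss-cn-hangzhou.aliyuncs.com", "my-bucket")
    save_to_oss({"weights": [0.1, 0.2]}, bucket, "my_project/my_model")
    print(load_from_oss(bucket, "my_project/my_model"))
```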

* Fix error caused by relative importing (#2823)

* fix error caused by relative importing

* clean up

* Use unified DB-API in codebase (#2821)

* Add maxcompute DB-API

* remove unused import

* format code

* polish db-api

* Adapt paiio with DB-API

* Adapt paiio with DB-API

* add try import paiio

* use db-api in old code

* DB-API supports Python2 so it can run on PAI

* polish db-api to support Python2 so it can run on PAI

* polish db-api to support Python2 so it can run on PAI

* polish db-api to support Python2 so it can run on PAI

* Use unified DB-API in codebase.

* Use unified DB-API in codebase.

* polish code

* remove debug info

* fix ut

* Generate workflow step for normal statement run (#2824)

* generate workflow step for normal statement run

* clean up

* build step image before run workflow test

* fix is_query

* Fix pai training with optimizer config (#2828)

* fix pai training with optimizer config

* remove template

* Save the trained xgboost model (#2822)

* save trained xgboost model

* fix flake8 check

* fix ut

* fix ut

* fix workflow ut

* fix cwd error
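
As a rough illustration of the "save trained xgboost model" change (#2822), the snippet below trains a tiny booster and round-trips it through the standard XGBoost save_model/load_model API; the toy data, parameters and file name are placeholders, not SQLFlow's actual model layout.

```python
# Illustrative save/load of a trained XGBoost booster; data, parameters
# and file name are placeholders, not SQLFlow's actual model layout.
import numpy as np
import xgboost as xgb

X = np.random.rand(20, 3)
y = np.random.randint(0, 2, size=20)
dtrain = xgb.DMatrix(X, label=y)

booster = xgb.train({"objective": "binary:logistic"}, dtrain, num_boost_round=5)
booster.save_model("my_model.bin")  # persist the trained model to disk

loaded = xgb.Booster()
loaded.load_model("my_model.bin")  # restore it later, e.g. in a predict step
print(loaded.predict(xgb.DMatrix(X))[:3])
```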

Co-authored-by: Wu Yi <[email protected]>
Co-authored-by: HongwuLin <[email protected]>
Co-authored-by: brightcoder01 <[email protected]>
4 people authored Aug 14, 2020
1 parent bb22929 commit b5888cc
Showing 139 changed files with 7,637 additions and 4,348 deletions.
65 changes: 44 additions & 21 deletions .github/workflows/main.yml
@@ -18,6 +18,7 @@ jobs:
- uses: actions/checkout@v1
- name: pre-commit
run: |
export TRAVIS_BUILD_DIR=${{ github.workspace }}
go generate ./...
go install ./...
pre-commit run -a --show-diff-on-failure
@@ -51,14 +52,17 @@ jobs:
docker run --rm -d --name=hive --net=host sqlflow/gohive:dev python3 -m http.server 8899
PYTHONPATH=${{ github.workspace }}/python scripts/test/hive.sh
# bash scripts/travis/upload_codecov.sh
- name: maxcompute unit test
run: |
set -e
source build/env/bin/activate
export SQLFLOW_TEST_DB_MAXCOMPUTE_AK=$MAXCOMPUTE_AK
export SQLFLOW_TEST_DB_MAXCOMPUTE_SK=$MAXCOMPUTE_SK
PYTHONPATH=${{ github.workspace }}/python bash scripts/test/maxcompute.sh
# bash scripts/travis/upload_codecov.sh
# FIXME(typhoonzero): maxcompute test often fails because connection timeout, we can add a
# maxcompute service close to the CI server and add this test back.
# - name: maxcompute unit test
# env:
# SQLFLOW_TEST_DB_MAXCOMPUTE_AK: ${{ secrets.MAXCOMPUTE_AK }}
# SQLFLOW_TEST_DB_MAXCOMPUTE_SK: ${{ secrets.MAXCOMPUTE_SK }}
# run: |
# set -e
# source build/env/bin/activate
# PYTHONPATH=${{ github.workspace }}/python bash scripts/test/maxcompute.sh
# # bash scripts/travis/upload_codecov.sh
- name: java unit test
run: |
set -e
@@ -77,13 +81,17 @@
set -e
bash scripts/test/prepare.sh
source build/env/bin/activate
# build sqlflow binaries under build/
bash docker/dev/build.sh
docker pull sqlflow/sqlflow:step
docker build --cache-from sqlflow/sqlflow:step -t sqlflow/sqlflow:step --build-arg FIND_FASTED_MIRROR="false" -f docker/step/Dockerfile .
bash scripts/test/workflow.sh
# bash scripts/travis/upload_codecov.sh
push:
runs-on: ubuntu-latest
push-images:
runs-on: [self-hosted, linux]
needs: [test-mysql, test-hive-java, test-workflow]
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v1
- uses: olegtarasov/get-tag@v2
id: tagName
- if: ${{ github.event_name == 'schedule' }}
@@ -94,16 +102,6 @@
run: echo "::set-env name=TRAVIS_BRANCH::${{ github.head_ref }}"
- if: ${{ github.event_name == 'push' }}
run: echo "::set-env name=TRAVIS_BRANCH::${GITHUB_REF##*/}"
- name: release latest linux client binary
env:
TRAVIS_OS_NAME: linux
QINIU_AK: ${{ secrets.QINIU_AK }}
QINIU_SK: ${{ secrets.QINIU_SK }}
run: |
export TRAVIS_BUILD_DIR=${{ github.workspace }}
export TRAVIS_TAG=${{ steps.tagName.outputs.tag }}
export TRAVIS_PULL_REQUEST=${{ github.event.number }}
bash scripts/travis/deploy_client.sh
- name: push server images
env:
DOCKER_USERNAME: "typhoon1986"
@@ -117,6 +115,31 @@
export FIND_FASTED_MIRROR=false
export TRAVIS_BUILD_STAGE_NAME=Deploy
bash scripts/travis/deploy_docker.sh
linux-client:
runs-on: ubuntu-latest
needs: [test-mysql, test-hive-java, test-workflow]
steps:
- uses: actions/checkout@v2
- uses: olegtarasov/get-tag@v2
id: tagName
- if: ${{ github.event_name == 'schedule' }}
run: |
echo "::set-env name=TRAVIS_EVENT_TYPE::cron"
echo "::set-env name=TRAVIS_BRANCH::${GITHUB_REF##*/}"
- if: ${{ github.event_name == 'pull_request' }}
run: echo "::set-env name=TRAVIS_BRANCH::${{ github.head_ref }}"
- if: ${{ github.event_name == 'push' }}
run: echo "::set-env name=TRAVIS_BRANCH::${GITHUB_REF##*/}"
- name: release latest linux client binary
env:
TRAVIS_OS_NAME: linux
QINIU_AK: ${{ secrets.QINIU_AK }}
QINIU_SK: ${{ secrets.QINIU_SK }}
run: |
export TRAVIS_BUILD_DIR=${{ github.workspace }}
export TRAVIS_TAG=${{ steps.tagName.outputs.tag }}
export TRAVIS_PULL_REQUEST=${{ github.event.number }}
bash scripts/travis/deploy_client.sh
# TODO(typhoonzero): remove travis envs when we have moved to github actions completely
macos-client:
runs-on: macos-latest
139 changes: 0 additions & 139 deletions .travis.yml

This file was deleted.

2 changes: 1 addition & 1 deletion README.md
@@ -1,6 +1,6 @@
# SQLFlow

[![Build Status](https://travis-ci.com/sql-machine-learning/sqlflow.svg?branch=develop)](https://travis-ci.com/sql-machine-learning/sqlflow)
![CI](https://github.com/sql-machine-learning/sqlflow/workflows/CI/badge.svg)
[![codecov](https://codecov.io/gh/sql-machine-learning/sqlflow/branch/develop/graph/badge.svg)](https://codecov.io/gh/sql-machine-learning/sqlflow)
[![GoDoc](https://godoc.org/github.com/sql-machine-learning/sqlflow?status.svg)](https://godoc.org/github.com/sql-machine-learning/sqlflow)
[![License](https://img.shields.io/badge/license-Apache%202-blue.svg)](LICENSE)