This repository was archived by the owner on Sep 29, 2020. It is now read-only.

Test action megathread (issue) #1

Open
neverendingqs opened this issue Aug 22, 2020 · 61 comments

Comments

@neverendingqs
Owner

neverendingqs commented Aug 22, 2020

Use this issue to test the ask-so workflow.

Usage:

/so <query>
@neverendingqs
Owner Author

Creating a comment.

@neverendingqs
Owner Author

Another comment

@neverendingqs
Owner Author

3

@neverendingqs
Owner Author

4

@neverendingqs neverendingqs changed the title Test action megathread Test action megathread (issue) Aug 22, 2020
@neverendingqs
Owner Author

5

@neverendingqs
Owner Author

6

@neverendingqs
Owner Author

7

@neverendingqs
Owner Author

8

@neverendingqs
Owner Author

9

@github-actions
Contributor

Received!

@neverendingqs
Owner Author

/so github actions

@github-actions
Contributor

Received query 'github actions!'

@neverendingqs
Owner Author

10

@neverendingqs
Owner Author

/so github actions

@github-actions
Contributor

github-actions bot commented Aug 22, 2020

github%20actions

How to run GitHub Actions workflows locally?

Answers

Github Actions Angular-cli

Answers answer 1 --- answer 2

Github Actions v4 API?

Answers

@neverendingqs
Owner Author

/so github actions

@github-actions
Contributor

Results for github actions

How to run GitHub Actions workflows locally?

Answers

Github Actions Angular-cli

Answers

Github Actions v4 API?

Answers

@neverendingqs
Owner Author

/so github actions

@github-actions
Contributor

Results for github actions

How to run GitHub Actions workflows locally?

Answers

By iirekm (Votes: 15)

There are tools like the already-mentioned act, but they are not perfect. You are not alone with this issue. Similar problems are:

  • how to test Jenkins builds locally
  • how to test GitLab CI builds locally
  • how to test Circle CI builds locally
  • how to test XXXX builds locally

And my solution for these problems is:

  • avoid functionalities provided by your CI tools (GitHub Actions, Gitlab CI, etc)
  • write as much as possible in CI-agnostic way (BASH scripts, PowerShell scripts, Gradle scripts, NPM scripts, Dockerfiles, Ansible scripts - anything you know)
  • invoke those scripts from your CI tool. In GitHub actions: run: your command to run

By Webber (Votes: 9)

You can use nektos/act which supports yaml syntax since 0.2.0 (prerelease).

Check out their latest release.

By Jubair (Votes: 3)

Your best bet is https://github.com/nektos/act; however, it doesn't support YAML syntax yet, though there is a lot of interest, e.g. nektos/act#80, nektos/act#76 and nektos/act#74

Gitlab has gitlab-runner exec docker job-name but that's Gitlab :)

Github Actions Angular-cli

Answers

By rams (Votes: 2)

You seem to be new to GitHub Actions and deployments. From my experience, I assume you have reached the point of installing Angular-CLI, due to "ng not found" issues occurring in the action flow.

- uses: actions/checkout@v1
- name: Install Node
  uses: actions/setup-node@v1
  with:
    node-version: 12.8
- name: npm dependencies
  run: npm install
- name: Build
  run: npm run build -- --prod

Fix Detail: Install Node first and then try npm install and npm build

Github Actions v4 API?

Answers

By VonC (Votes: 1)

Not yet: the GitHub Actions API announced last January 2020 is:

Note that there is a difference between:

For now, Actions are exposed through REST, not GraphQL

The request has been made of course:

As of now, the GitHub Actions API enables you to manage GitHub Actions only using the REST API.
I'll be sure to pass on your request for v4 support to the team for consideration.

(that was mid-April 2020)

@neverendingqs
Owner Author

/so github actions

@github-actions
Contributor

Results for github actions

How to run GitHub Actions workflows locally?

I am planning to move our Travis CI build to GitHub Actions using Docker for our per-commit testing.

Can I reproducibly run these new GitHub Actions workflows locally? Is there a generic way to run any GitHub Actions workflow locally?

Answers

By iirekm (Votes: 15)

There are tools like the already-mentioned act, but they are not perfect. You are not alone with this issue. Similar problems are:

  • how to test Jenkins builds locally
  • how to test GitLab CI builds locally
  • how to test Circle CI builds locally
  • how to test XXXX builds locally

And my solution for these problems is:

  • avoid functionalities provided by your CI tools (GitHub Actions, Gitlab CI, etc)
  • write as much as possible in CI-agnostic way (BASH scripts, PowerShell scripts, Gradle scripts, NPM scripts, Dockerfiles, Ansible scripts - anything you know)
  • invoke those scripts from your CI tool. In GitHub actions: run: your command to run
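
A minimal sketch of that last step, assuming a hypothetical ci/build.sh script in the repository (the file name and the npm commands are illustrative, not from the answer):

#!/usr/bin/env bash
# ci/build.sh -- hypothetical CI-agnostic build script; runs the same way locally and in any CI
set -euo pipefail
npm ci           # install exact dependencies
npm test         # run the test suite
npm run build    # produce the build artifacts

# The GitHub Actions workflow step would then reduce to:
#   - run: ./ci/build.sh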

By Webber (Votes: 9)

You can use nektos/act which supports yaml syntax since 0.2.0 (prerelease).

Check out their latest release.

By Jubair (Votes: 3)

Your best bet is https://github.com/nektos/act; however, it doesn't support YAML syntax yet, though there is a lot of interest, e.g. nektos/act#80, nektos/act#76 and nektos/act#74

Gitlab has gitlab-runner exec docker job-name but that's Gitlab :)
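
For context, a typical local session with nektos/act (a sketch; Docker is assumed to be available and the job name is illustrative) might look like:

# list the jobs act detects in .github/workflows
act -l

# run the workflows that would be triggered by a push event
act push

# run only the job named "build"
act -j build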

Github Actions Angular-cli

A project has been created and placed on GitHub. I was trying to explore GitHub Actions by building Angular-CLI projects.

The .yml file for the GitHub Action is as follows:

  steps:
        - uses: actions/checkout@v1
        - name: Install NPM
          run:  npm install
        - name: Update Npm
          run:  npm update
        - name: Typescript compiler
          uses: iCrawl/action-tsc@v1
        - name: NpM angular CLI 
          uses: npm install angular-cli
        - name: Build
          run:  npm build

Then, while building, I get the following error:

The pipeline is not valid. .github/workflows/main.yml (Line: 19, Col: 13): Expected format {org}/{repo}[/path]@ref. Actual 'npm install angular-cli',Input string was not in a correct format.

Answers

By rams (Votes: 2)

You seem to be new to GitHub Actions and deployments. From my experience, I assume you have reached the point of installing Angular-CLI, due to "ng not found" issues occurring in the action flow.

- uses: actions/checkout@v1
- name: Install Node
  uses: actions/setup-node@v1
  with:
    node-version: 12.8
- name: npm dependencies
  run: npm install
- name: Build
  run: npm run build -- --prod

Fix Detail: Install Node first and then try npm install and npm build

Github Actions v4 API?

I need to interact with Github Actions through some code. I found the v3 API for that, but I've been migrating code to use the v4 API instead. I have not been able to find anything related to github actions on v4, unfortunately. Using introspection won't show "artifacts" or "workflows", except for copying the latter on CloneProjectInput, but, perhaps, there's a feature flag I need to use to get them.

Is there any way to interact with Github Actions through their v4 API?

Answers

By VonC (Votes: 1)

Not yet: the GitHub Actions API announced last January 2020 is:

Note that there is a difference between:

For now, Actions are exposed through REST, not GraphQL

The request has been made of course:

As of now, the GitHub Actions API enables you to manage GitHub Actions only using the REST API.
I'll be sure to pass on your request for v4 support to the team for consideration.

(that was mid-April 2020)

@neverendingqs
Owner Author

/so git rebase interactive

@github-actions
Contributor

Results for git rebase interactive

How can I merge two commits into one if I already started rebase?

I am trying to merge 2 commits into 1, so I followed “squashing commits with rebase” from git ready.

I ran

git rebase --interactive HEAD~2

In the resulting editor, I change pick to squash and then save-quit, but the rebase fails with the error

Cannot 'squash' without a previous commit

Now that my work tree has reached this state, I’m having trouble recovering.

The command git rebase --interactive HEAD~2 fails with:

Interactive rebase already started

and git rebase --continue fails with

Cannot 'squash' without a previous commit

Answers

By Greg Bacon (Votes: 1760)

Summary

The error message

Cannot 'squash' without a previous commit

means you likely attempted to “squash downward.” Git always squashes a newer commit into an older commit or “upward” as viewed on the interactive rebase todo list, that is into a commit on a previous line. Changing the command on your todo list’s very first line to squash will always produce this error as there is nothing for the first commit to squash into.

The Fix

First get back to where you started with

$ git rebase --abort

Say your history is

$ git log --pretty=oneline
a931ac7c808e2471b22b5bd20f0cad046b1c5d0d c
b76d157d507e819d7511132bdb5a80dd421d854f b
df239176e1a2ffac927d8b496ea00d5488481db5 a

That is, a was the first commit, then b, and finally c. After committing c we decide to squash b and c together:

(Note: Running git log pipes its output into a pager, less by default on most platforms. To quit the pager and return to your command prompt, press the q key.)

Running git rebase --interactive HEAD~2 gives you an editor with

pick b76d157 b
pick a931ac7 c

# Rebase df23917..a931ac7 onto df23917
#
# Commands:
#  p, pick = use commit
#  r, reword = use commit, but edit the commit message
#  e, edit = use commit, but stop for amending
#  s, squash = use commit, but meld into previous commit
#  f, fixup = like "squash", but discard this commit's log message
#
# If you remove a line here THAT COMMIT WILL BE LOST.
# However, if you remove everything, the rebase will be aborted.
#

(Notice that this todo list is in the reverse order as compared with the output of git log.)

Changing b’s pick to squash will result in the error you saw, but if instead you squash c into b (newer commit into the older or “squashing upward”) by changing the todo list to

pick   b76d157 b
squash a931ac7 c

and save-quitting your editor, you'll get another editor whose contents are

# This is a combination of 2 commits.
# The first commit's message is:

b

# This is the 2nd commit message:

c

When you save and quit, the contents of the edited file become the commit message of the new combined commit:

$ git log --pretty=oneline
18fd73d3ce748f2a58d1b566c03dd9dafe0b6b4f b and c
df239176e1a2ffac927d8b496ea00d5488481db5 a

Note About Rewriting History

Interactive rebase rewrites history. Attempting to push to a remote that contains the old history will fail because it is not a fast-forward.

If the branch you rebased is a topic or feature branch in which you are working by yourself, no big deal. Pushing to another repository will require the --force option, or alternatively you may be able, depending on the remote repository’s permissions, to first delete the old branch and then push the rebased version. Examples of those commands that will potentially destroy work are outside the scope of this answer.

Rewriting already-published history on a branch in which you are working with other people, without a very good reason such as a leaked password or other sensitive details, forces work onto your collaborators; it is antisocial and will annoy other developers. The “Recovering From an Upstream Rebase” section in the git rebase documentation explains (with added emphasis):

Rebasing (or any other form of rewriting) a branch that others have based work on is a bad idea: anyone downstream of it is forced to manually fix their history. This section explains how to do the fix from the downstream’s point of view. The real fix, however, would be to avoid rebasing the upstream in the first place.

By user3828059 (Votes: 418)

If there are multiple commits, you can use git rebase -i to squash two commits into one.

If there are only two commits you want to merge, and they are the "most recent two", the following commands can be used to combine the two commits into one:

git reset --soft "HEAD^"
git commit --amend

By pambda (Votes: 119)

Rebase: You Ain't Gonna Need It:

A simpler way for the most frequent scenario.

In most cases:

Actually, if all you want is simply to combine several recent commits into one, and you do not need drop, reword or other rebase work,

you can simply do:

git reset --soft "HEAD~n"
  • Assuming ~n is the number of commits to softly un-commit (e.g. ~1, ~2, ...)

Then, use the following command to modify the commit message:

git commit --amend

which is pretty much the same as a long range of squash and one pick.

And it works for n commits, not just two commits as the answer above suggested.

Git - How to fix "corrupted" interactive rebase?

I managed to create a little mess in my local git repository. I was trying to fix a broken commit by using the following instructions. Before running the "git commit --amend" (and after the git rebase --interactive) I decided that my changes were incorrect and so I executed "git reset HEAD --hard". Not a good idea, I tell you.

Now the interactive rebase seems to be "stuck". Git shows the current branch as (|REBASE-m). Every command (cd .., ls, git rebase...) inside my repository gives the following error:

cat: .git/rebase-merge/head-name: No such file or directory

Here's what git rebase --abort looks like:

$ git rebase --abort
cat: c:/_work/project/src/git/.git/rebase-merge/quiet: No such file or directory
cat: c:/_work/project/src/git/.git/rebase-merge/head-name: No such file or directory
cat: c:/_work/project/src/git/.git/rebase-merge/orig-head: No such file or directory
HEAD is now at 4c737fb Revert "Modified file names"
rm: cannot remove `c:/_work/project/src/git/.git/rebase-merge/done': Permission denied
rm: cannot remove directory `c:/_work/project/src/git/.git/rebase-merge': Directory
not empty
cat: .git/rebase-merge/head-name: No such file or directory

Here's the result of git rebase --continue:

$ git rebase --continue
cat: c:/_work/project/src/git/.git/rebase-merge/prev_head: No such file or directory
cat: c:/_work/project/src/git/.git/rebase-merge/end: No such file or directory
cat: c:/_work/project/src/git/.git/rebase-merge/msgnum: No such file or directory
cat: c:/_work/project/src/git/.git/rebase-merge/onto: No such file or directory
cat: c:/_work/project/src/git/.git/rebase-merge/quiet: No such file or directory
prev_head must be defined
cat: .git/rebase-merge/head-name: No such file or directory

Any ideas? I would like to reset the situation back to the state it was in before I started my well-thought-out rebase operation.

Here's how git log --oneline shows the situation:

4c737fb Revert "Modified file names"
247ac02 Modified file names
33141e6 Message modifications
10a4a04 Modified db script

And this is fine.

I'm using msysgit v1.7.0.2.

Answers

By Laura Slocum (Votes: 218)

I got stuck in this. I created the head-name file, and then I ran into another error saying it couldn't find the onto file, so I created that file. Then I got another error saying could not read '.git/rebase-apply/onto': No such file or directory.

So I looked at the git documentation for rebasing and found another command:

git rebase --quit

This set me back on my branch with no changes, and I could start my rebase over again, good as new.

By Martin Owen (Votes: 163)

It looks like Git tried to remove the .git/rebase-merge directory but wasn't able to remove it completely. Have you tried copying that folder away? Also copy away the .git/rebase-apply folder if that is present.
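
A sketch of what "copying that folder away" could look like on the command line (the backup locations are arbitrary):

# move the leftover rebase state out of the repository so Git stops seeing an in-progress rebase
mv .git/rebase-merge ../rebase-merge-backup
mv .git/rebase-apply ../rebase-apply-backup   # only if this folder exists
git status                                    # should no longer report a rebase in progress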

By user584317 (Votes: 90)

I had a similar problem due to a zombie vim.exe process. Killing it in Task Manager, followed by a git rebase --abort fixed it.

How to abort a git rebase from inside vim during interactive editing

When I do an interactive rebase, e.g.

git rebase -i HEAD~3

the rebase interactive editor (vim in my case) opens to let me edit the commits to rebase

pick c843ea2 Set Vim column limit to 80 (OS X)
pick fc32eac Add Bash alias for `pbcopy` (OS X)
....

If I now decide that I want to abort the rebase and quit vim using :q, the rebase starts anyway. I'm using git version 1.9.0.msysgit.0 on Windows.

Sure I can just delete all pick lines, but it might be a lot to do if I rebase a longer history. Is there another way?

How can I quit the rebase interactive editor (vim) and abort the rebase?

Answers

By Telmo Costa (Votes: 258)

If you exit the editor with an error code, the rebase will be aborted.

To exit with an error code on vim, do

:cq

See here and vimdoc here.

By Steven Penny (Votes: 44)

Just delete all the lines that are not comments from the file. Then save it to the default supplied path and exit the editor.

As a result git will just do nothing in this rebase.

To delete all lines in vim you can use

:%d|x

Then save and quit.

delete all lines of file in Vim

How to exit the Vim editor?

VIM - multiple commands on same line

By CodeManX (Votes: 9)

If you're using Notepad++ on Windows for instance:

Ctrl+A to select everything, then Del or Backspace to delete, and Ctrl+S to save. Git will abort the rebase if the file is empty.

You may also hit Ctrl+C in the command prompt where git is running to stop the current rebase command. Then run git rebase --abort to revert it.

@neverendingqs
Owner Author

/so git reset

@github-actions
Contributor

Results for git reset

Reset local repository branch to be just like remote repository HEAD

Question

How do I reset my local branch to be just like the branch on the remote repository?

I did:

git reset --hard HEAD

But when I run a git status,

On branch master
Changes to be committed:
  (use "git reset HEAD <file>..." to unstage)
      modified:   java/com/mycompany/TestContacts.java
      modified:   java/com/mycompany/TestParser.java

Can you please tell me why I have these files marked as 'modified'? I haven't touched these files. If I did, I want to remove those changes.

Answers

By Dan Moulding (Votes: 6947)

Setting your branch to exactly match the remote branch can be done in two steps:

git fetch origin
git reset --hard origin/master

If you want to save your current branch's state before doing this (just in case), you can do:

git commit -a -m "Saving my work, just in case"
git branch my-saved-work

Now your work is saved on the branch "my-saved-work" in case you decide you want it back (or want to look at it later or diff it against your updated branch).

Note that the first example assumes that the remote repo's name is "origin" and that the branch named "master" in the remote repo matches the currently checked-out branch in your local repo.

BTW, this situation that you're in looks an awful lot like a common case where a push has been done into the currently checked out branch of a non-bare repository. Did you recently push into your local repo? If not, then no worries -- something else must have caused these files to unexpectedly end up modified. Otherwise, you should be aware that it's not recommended to push into a non-bare repository (and not into the currently checked-out branch, in particular).

By Akavall (Votes: 446)

I needed to do (the solution in the accepted answer):

git fetch origin
git reset --hard origin/master

Followed by:

git clean -f

to remove local files

To see what files will be removed (without actually removing them):

git clean -n -f

By Acumenus (Votes: 323)

First, reset to the previously fetched HEAD of the corresponding upstream branch:

git reset --hard @{u}

The advantage of specifying @{u} or its verbose form @{upstream} is that the name of the remote repo and branch don't have to be explicitly specified. On Windows or with PowerShell, specify "@{u}" (with double quotes).

Next, as needed, remove untracked files, optionally also with -x:

git clean -df

Finally, as needed, get the latest changes:

git pull

Delete commits from a branch in Git

Question

I would like to know how to delete a commit.

By delete, I mean it is as if I didn't make that commit, and when I do a push in the future, my changes will not push to the remote branch.

I read git help, and I think the command I should use is git reset --hard HEAD. Is this correct?

Answers

By gahooa (Votes: 4216)

Careful: git reset --hard WILL DELETE YOUR WORKING DIRECTORY CHANGES. Be sure to stash any local changes you want to keep before running this command.

Assuming you are sitting on that commit, then this command will whack it...

git reset --hard HEAD~1

The HEAD~1 means the commit before HEAD.

Or, you could look at the output of git log, find the commit id of the commit you want to back up to, and then do this:

git reset --hard <sha1-commit-id>

If you already pushed it, you will need to do a force push to get rid of it...

git push origin HEAD --force

However, if others may have pulled it, then you would be better off starting a new branch. Because when they pull, it will just merge it into their work, and you will get it pushed back up again.

If you already pushed, it may be better to use git revert, to create a "mirror image" commit that will undo the changes. However, both commits will be in the log.


FYI -- git reset --hard HEAD is great if you want to get rid of WORK IN PROGRESS. It will reset you back to the most recent commit, and erase all the changes in your working tree and index.


Lastly, if you need to find a commit that you "deleted", it is typically present in git reflog unless you have garbage collected your repository.
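
As an illustration (a sketch; the SHA shown is a placeholder), recovering such a commit from the reflog could look like:

git reflog                        # find the SHA of the "deleted" commit, e.g. abc1234
git branch rescued-work abc1234   # put a branch on it so it can't be garbage collected
# or move the current branch back onto it:
# git reset --hard abc1234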

By Greg Hewgill (Votes: 717)

If you have not yet pushed the commit anywhere, you can use git rebase -i to remove that commit. First, find out how far back that commit is (approximately). Then do:

git rebase -i HEAD~N

The ~N means rebase the last N commits (N must be a number, for example HEAD~10). Then, you can edit the file that Git presents to you to delete the offending commit. On saving that file, Git will then rewrite all the following commits as if the one you deleted didn't exist.

The Git Book has a good section on rebasing with pictures and examples.

Be careful with this though, because if you change something that you have pushed elsewhere, another approach will be needed unless you are planning to do a force push.

By 1800 INFORMATION (Votes: 527)

Another possibility is one of my personal favorite commands:

git rebase -i <commit>~1

This will start the rebase in interactive mode -i at the point just before the commit you want to whack. The editor will start up listing all of the commits since then. Delete the line containing the commit you want to obliterate and save the file. Rebase will do the rest of the work, deleting only that commit, and replaying all of the others back into the log.

How to undo 'git reset'?

Question

What's the simplest way to undo the

git reset HEAD~

command? Currently, the only way I can think of is doing a "git clone http://..." from a remote repo.

Answers

By Mark Lodato (Votes: 2451)

Short answer:

git reset 'HEAD@{1}'

Long answer:

Git keeps a log of all ref updates (e.g., checkout, reset, commit, merge). You can view it by typing:

git reflog

Somewhere in this list is the commit that you lost. Let's say you just typed git reset HEAD~ and want to undo it. My reflog looks like this:

$ git reflog
3f6db14 HEAD@{0}: HEAD~: updating HEAD
d27924e HEAD@{1}: checkout: moving from d27924e0fe16776f0d0f1ee2933a0334a4787b4c
[...]

The first line says that HEAD 0 positions ago (in other words, the current position) is 3f6db14; it was obtained by resetting to HEAD~. The second line says that HEAD 1 position ago (in other words, the state before the reset) is d27924e. It was obtained by checking out a particular commit (though that's not important right now). So, to undo the reset, run git reset HEAD@{1} (or git reset d27924e).

If, on the other hand, you've run some other commands since then that update HEAD, the commit you want won't be at the top of the list, and you'll need to search through the reflog.

One final note: It may be easier to look at the reflog for the specific branch you want to un-reset, say master, rather than HEAD:

$ git reflog show master
c24138b master@{0}: merge origin/master: Fast-forward
90a2bf9 master@{1}: merge origin/master: Fast-forward
[...]

This should have less noise in it than the general HEAD reflog.

By Dan Fischer (Votes: 205)

Old question, and the posted answers work great. I'll chime in with another option though.

git reset ORIG_HEAD

ORIG_HEAD references the commit that HEAD previously referenced.

By zainengineer (Votes: 57)

My situation was slightly different: I did git reset HEAD~ three times.

To undo it I had to do

git reset HEAD@{3}

so you should be able to do

git reset HEAD@{N}

But if you have done git reset using

git reset HEAD~3

you will need to do

git reset HEAD@{1}

{N} represents the number of operations in reflog, as Mark pointed out in the comments.

@neverendingqs
Owner Author

/so git one line graph

@github-actions
Contributor

Results for git one line graph

Why does my git log graph grow one more line after I merge a branch each time?

Question

I'm used to using git log --oneline --graph --decorate --all as the alias git ll to see the graph of commits in the terminal.

But a problem confuses me every single time I merge my develop into master. The output of the command above may look like this:

* 0d1bf7b (HEAD -> master) Fix typo
*   f843224 Merge 'develop' to 'master'
|\
* | d673b76 (origin/master) Remove console.log for license information
* | 5080afc Remove all http url in production
* |   f28e74b Merge branch 'develop'
|\ \
* \ \   75c5b90 Merge branch 'develop'
|\ \ \
* \ \ \   ec189e6 Merge branch 'develop'
|\ \ \ \
* \ \ \ \   eb79c75 Merge branch 'develop'
|\ \ \ \ \
* \ \ \ \ \   74631ef Merge branch 'develop'
|\ \ \ \ \ \
| | | | | | | * f7a4155 (light) Fix typo
| | | | | | | *   1d6c411 Merge 'develop' to 'light'
| | | | | | | |\
| | | | | | | |/
| | | | | | |/|
| | | | | | * | 3715f47 (develop) Finish GroupCard in Setting page
| | | | | | | * e606e68 (origin/light) Remove console.log for license information
| | | | | | | * 676774c Remove all http url in production
| | | | | | | * c1bef16 Fix api url error

You can see there are too many lines generated after I merge develop into master. It is not a big problem for now, but someday there will be so many lines that they obstruct my view of the commits.

So is there anything I am doing wrong? Have you ever faced this kind of problem? How do you handle it?


[Edited on 2019/05/20]

Thank you guys. I appreciate your kind answers.

I want to clarify my problem a little bit. I use some GUI tools like Sourcetree, and they show the git log as below. As you can see, there are not many complicated lines for the same repository in this graph.

So is it possible to show the graph like that in my command-line interface?

git log in Source tree

Answers

By VonC (Votes: 2)

That is why squash and rebase exist (for local commits of develop you have not pushed yet).
That would help keep the history linear, instead of git log showing you each develop merge in its own separate track.

So is it possible to show the graph like that in my command-line interface?

In command-line, you can avoid all those extra lines by adding --no-merges:

git log --decorate --oneline --graph --no-merges --all --branches
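
If you want to keep those flags behind the git ll alias mentioned in the question, one way to set it up (a sketch; the alias name is taken from the question) is:

git config --global alias.ll "log --oneline --graph --decorate --all --no-merges"
git ll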

By Rajni Kewlani (Votes: 0)

Of course you should use rebase and squash wherever applicable. Additionally, you can try the following --no-merges option, like:

git log --oneline --graph --decorate --all --no-merges

How to remove (squash) unnamed git branches

Question

I have a git repo with an absurdly complex tree. I want to simplify that tree so that each unnamed branch becomes a single commit and just one line remains in the tree.

I have tried to:

  1. remove commits by interactive rebase
  2. squash all the commits in the unnamed branches

In both cases, I have issues with submodule conflicts. The repo I am working on is this one: link.

For future reference, I leave here the present status of the git graph log:

*   9b0bc07 - (HEAD -> master, origin/master, origin/HEAD) Merge branch 'master' of https://framagit.org/sapo/ph.d.-project (16 hours ago) 
|\  
| * df5584b - Finished methods description and analysis (18 hours ago) 
| * 665d714 - Update comparison/README.md (20 hours ago) 
| * 6e31ac5 - Update comparison/README.md (5 days ago) 
| * b33094e - Update README.md (5 days ago) 
| *   336f5ad - Resolved merge (6 days ago) 
| |\  
| | * 1fba1c9 - Reorganized readme (7 days ago) 
| | * 12a313c - Still debugging, problem with polyphony (9 days ago) 
| | * 744ad4b - Still debugging runPlyAlignment (10 days ago) 
| | * 429fd13 - Added few modules. polyAlignment should work now, to be tested (13 days ago) 
| | *   25a83e6 - Finished broad_alingment, doing precise (13 days ago) 
| | |\  
| | | * 7a26990 - Update README.md (2 weeks ago) 
| | | * ccf7532 - Now tracking edited AMPACT code (3 weeks ago) 
| | | * 67a7d31 - Added configuration file for managing experiments (3 weeks ago) 
| | | * f2dafb9 - Error in Bach10 number of notes and loading sources (3 weeks ago) 
| | | * 16c4fe0 - Update comparison/README.md (4 weeks ago) 
| | | * 2983292 - updated datasets (4 weeks ago) 
| | | * 245f36e - Added symbolic link to utils (5 weeks ago) 
| | | * 202ee49 - Removed double submodule utils (5 weeks ago) 
| | | * 6a7ca53 - Updating  datasets (5 weeks ago) 
| | | * 0b25e03 - Trying to solve submodule issue (5 weeks ago) 
| | | * 96b4d90 - Moved utils to submodule (5 weeks ago) 
| | | * d4e7dc2 - Moved datasets to submodule (5 weeks ago) 
| | | * 667492c - Add LICENSE (5 weeks ago) 
| | | * 3882ad0 - Initial commit (5 weeks ago) 
| | * 253c3ea - Now tracking edited AMPACT code (3 weeks ago) 
| | * 022d483 - Added configuration file for managing experiments (3 weeks ago) 
| | * 557daba - Error in Bach10 number of notes and loading sources (3 weeks ago) 
| | * 434e3ae - Update comparison/README.md (4 weeks ago) 
| | * 66bc782 - updated datasets (4 weeks ago) 
| | * 1099294 - Added symbolic link to utils (5 weeks ago) 
| | * 7b43ae8 - Removed double submodule utils (5 weeks ago) 
| | * ae824ac - Updating  datasets (5 weeks ago) 
| | * 37dceed - Trying to solve submodule issue (5 weeks ago) 
| | * 2d83417 - Moved utils to submodule (5 weeks ago) 
| | * f11aa29 - Moved datasets to submodule (5 weeks ago) 
| | * 24a6b04 - Add LICENSE (5 weeks ago) 
| | * dd1e88d - Initial commit (5 weeks ago) 
| * a342f5e - Start tracking midiVelocity code and added dependencies for toyExampleRunScript (6 days ago) 
| * f2f8c48 - Still debugging, problem with polyphony (9 days ago) 
| * f7ede7a - Still debugging runPlyAlignment (10 days ago) 
| * 8f3b455 - Added few modules. polyAlignment should work now, to be tested (13 days ago) 
| *   415431e - Finished broad_alingment, doing precise (13 days ago) 
| |\  
| | * a6907a7 - Update README.md (2 weeks ago) 
| | * ab075bc - Now tracking edited AMPACT code (3 weeks ago) 
| | * 7f41e99 - Added configuration file for managing experiments (3 weeks ago) 
| | * cf925d6 - Error in Bach10 number of notes and loading sources (3 weeks ago) 
| | * eb2ffa4 - Update comparison/README.md (4 weeks ago) 
| | * a6b3992 - updated datasets (4 weeks ago) 
| | * 3f28358 - Added symbolic link to utils (5 weeks ago) 
| | * fb8c1db - Removed double submodule utils (5 weeks ago) 
| | * 99c6b86 - Updating  datasets (5 weeks ago) 
| | * 63646c3 - Trying to solve submodule issue (5 weeks ago) 
| | * 08cdad7 - Moved utils to submodule (5 weeks ago) 
| | * dc270c3 - Moved datasets to submodule (5 weeks ago) 
| | * 23a7cae - Add LICENSE (5 weeks ago) 
| | * 8aa959d - Initial commit (5 weeks ago) 
| * 95c1561 - Now tracking edited AMPACT code (3 weeks ago) 
| * 755842e - Added configuration file for managing experiments (3 weeks ago) 
| * 87ef237 - Error in Bach10 number of notes and loading sources (3 weeks ago) 
| * ad7de96 - Update comparison/README.md (4 weeks ago) 
| * e8306d8 - updated datasets (4 weeks ago) 
| * 6c56644 - Added symbolic link to utils (5 weeks ago) 
| * 83dc5e5 - Removed double submodule utils (5 weeks ago) 
| * 4d3480a - Updating  datasets (5 weeks ago) 
| * cd17adb - Trying to solve submodule issue (5 weeks ago) 
| * 8cee7fa - Moved utils to submodule (5 weeks ago) 
| * 51eb100 - Moved datasets to submodule (5 weeks ago) 
| * 654ad09 - Add LICENSE (5 weeks ago) 
| * ffe8f95 - Initial commit (5 weeks ago) 
* cde3680 - Added ewert-mueller-synch-method (16 hours ago) 
* 26ed69f - Finished methods description and analysis (18 hours ago) 
* de98e01 - Update comparison/README.md (20 hours ago) 
* 3a2f81c - Update comparison/README.md (5 days ago) 
* 1bbaf02 - Update README.md (5 days ago) 
*   08e1719 - Resolved merge (6 days ago) 
|\  
| * 3442cf3 - Reorganized readme (7 days ago) 
| * c9a382e - Still debugging, problem with polyphony (9 days ago) 
| * 6eeaa55 - Still debugging runPlyAlignment (10 days ago) 
| * 9ee5b09 - Added few modules. polyAlignment should work now, to be tested (13 days ago) 
| *   72cc527 - Finished broad_alingment, doing precise (13 days ago) 
| |\  
| | * 1fa24ce - Update README.md (2 weeks ago) 
| | * 65d8ec3 - Now tracking edited AMPACT code (3 weeks ago) 
| | * 90c2fcd - Added configuration file for managing experiments (3 weeks ago) 
| | * 0dfab99 - Error in Bach10 number of notes and loading sources (3 weeks ago) 
| | * 8adc63b - Update comparison/README.md (4 weeks ago) 
| | * 8f37d17 - updated datasets (4 weeks ago) 
| | * 2731ac6 - Added symbolic link to utils (5 weeks ago) 
| | * 7d3e966 - Removed double submodule utils (5 weeks ago) 
| | * 29486c3 - Updating  datasets (5 weeks ago) 
| | * 4da21a2 - Trying to solve submodule issue (5 weeks ago) 
| | * 113978b - Moved utils to submodule (5 weeks ago) 
| | * 147df61 - Moved datasets to submodule (5 weeks ago) 
| | * 0c49f6f - Add LICENSE (5 weeks ago) 
| | * b5edea7 - Initial commit (5 weeks ago) 
| * 647f169 - Now tracking edited AMPACT code (3 weeks ago) 
| * 06f03b8 - Added configuration file for managing experiments (3 weeks ago) 
| * d3da2bf - Error in Bach10 number of notes and loading sources (3 weeks ago) 
| * f2056cc - Update comparison/README.md (4 weeks ago) 
| * ed2d32b - updated datasets (4 weeks ago) 
| * 5f76af1 - Added symbolic link to utils (5 weeks ago) 
| * a36d228 - Removed double submodule utils (5 weeks ago) 
| * ff56e6c - Updating  datasets (5 weeks ago) 
| * a8113f6 - Trying to solve submodule issue (5 weeks ago) 
| * 439bbe6 - Moved utils to submodule (5 weeks ago) 
| * f1900ac - Moved datasets to submodule (5 weeks ago) 
| * 4b95c5c - Add LICENSE (5 weeks ago) 
| * 4452ced - Initial commit (5 weeks ago) 
* f6cf1b3 - Start tracking midiVelocity code and added dependencies for toyExampleRunScript (6 days ago) 
* bcc62dc - Still debugging, problem with polyphony (9 days ago) 
* 2e35a6e - Still debugging runPlyAlignment (10 days ago) 
* 671a59e - Added few modules. polyAlignment should work now, to be tested (13 days ago) 
*   84228aa - Finished broad_alingment, doing precise (13 days ago) 
|\  
| * 1e96218 - Update README.md (2 weeks ago) 
| * 04a420f - Now tracking edited AMPACT code (3 weeks ago) 
| * 6fbf6cf - Added configuration file for managing experiments (3 weeks ago) 
| * 9ba12b1 - Error in Bach10 number of notes and loading sources (3 weeks ago) 
| * 0652851 - Update comparison/README.md (4 weeks ago) 
| * a5d23af - updated datasets (4 weeks ago) 
| * 750e8bd - Added symbolic link to utils (5 weeks ago) 
| * 48f2509 - Removed double submodule utils (5 weeks ago) 
| * cf92067 - Updating  datasets (5 weeks ago) 
| * 79b804b - Trying to solve submodule issue (5 weeks ago) 
| * df419cb - Moved utils to submodule (5 weeks ago) 
| * 5f616a4 - Moved datasets to submodule (5 weeks ago) 
| * 9e07d9b - Add LICENSE (5 weeks ago) 
| * 04d3e95 - Initial commit (5 weeks ago) 
* 175279a - Now tracking edited AMPACT code (3 weeks ago) 
* 79b209f - Added configuration file for managing experiments (3 weeks ago) 
* 1489f41 - Error in Bach10 number of notes and loading sources (3 weeks ago) 
* 0cfdded - Update comparison/README.md (4 weeks ago) 
* 83c0006 - updated datasets (4 weeks ago) 
* d9ec598 - Added symbolic link to utils (5 weeks ago) 
* 00429a4 - Removed double submodule utils (5 weeks ago) 
* bd2c942 - Updating  datasets (5 weeks ago) 
* 88d7305 - Trying to solve submodule issue (5 weeks ago) 
* d3a0519 - Moved utils to submodule (5 weeks ago) 
* 3dcb36e - Moved datasets to submodule (5 weeks ago) 
* 84462c3 - Add LICENSE (5 weeks ago) 
* 5e49895 - Initial commit (5 weeks ago) 

Answers

By harmonica141 (Votes: 2)

Looking at your linked repository I see why the question is arising. A PhD thesis on git is a cool thing to do but something entirely different from the development of a software project. Nonetheless, partly the same rules apply. Having a longer history provides you with backups. I strongly recommend not taking those away, as you may regret it at some later point in time... (speaking from experience...)

Git as a version tracking tool exists for the sole purpose of keeping history. So squashing commits simply to make the tree less complex is the equivalent of burning history books.

To reduce complexity on the repository and restore maintainability you may want to refactor in order to implement a consistent branching scheme. There are several out there for any purpose and complexity of project. Maybe you have a look around in the git documentation or over at the Bitbucket guys from Atlassian.

A very popular branching model though is git flow. The basic idea is to have a continuous development branch that you merge new features (each being developed on a separate branch) into, and from which releases emerge into a stable branch. Each feature branch is closed after the feature is finished, so the number of open branches is naturally limited. Basically, in this workflow branches are used to sort code by its stability, starting from separate feature branches that cannot even live on their own, up to an ever-trustworthy stable release branch. Please note that there are helper scripts to do the hard work for you in this workflow; they make CLI work really easy.
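
A rough sketch of that feature cycle on the command line (branch names here are illustrative, not from the answer; the git-flow helper scripts wrap roughly these steps):

git checkout -b feature/alignment develop   # start the feature from the development branch
# ...commit work on the feature branch...
git checkout develop
git merge --no-ff feature/alignment         # keep the feature visible as its own strand in the graph
git branch -d feature/alignment             # close the feature branch once it is merged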

It also is important to recognize bad software. If there is an automated tool creating wild unnamed branches without order and permission, this is definitely a sign of bad software. Start taking git responsibility yourself and use something you can control. If you do not want to use the command line for some reason, then at least make use of a good GUI tool like GitKraken or Git Tower.

Take some time to find a branching model that suits your needs and then refactor the repository to enforce it. This will help you keep track of the ongoing work.

By the way, what you show in the picture is far from complex. It is just getting lengthy. But as work continues on a project you will accumulate quite a number of commits. This will become more extreme the more developers work on the same repository. There is no need or reason to shorten history, only to enforce law and order to keep it comprehensible.

Git commit option to split branches in the network graph

Question

On my git network graph, I want to keep branches separated. However, if I have a circumstance where I split my master into branch A, then make one or more commits to A, then merge it back to master, the graph shows a single line for both master and A, despite the fact that at least one commit point was not included.

In other words, my graph looks like this:

*------*------*------* (master/A)

And I want it to look like this:

*------*------*------* (master/A)
 \__________________/

I know there's an option either in commit or push to force this (and I did it once, ages ago), but for the life of me I can't find it.

Does anyone know the command to do this? And second, for extra credit, its location in Android Studio?

Thanks!

Answers

By Scott (Votes: 6)

As Mykola said in a comment, the answer is:

git merge --no-ff

The normal behavior of a git merge is to "fast forward" the base branch HEAD up to the place where you are on your new branch. This effectively erases your new branch (A in the example above). Specifying "no Fast Forward" adds a new commit and preserves the existence of Branch A for posterity.

Also see: http://www.relativesanity.com/articles/ffwd

Bonus Answer: In Android Studio, it's possible to use the no-ff option. However you cannot use the quick branch menu on the bottom right where you select the branch and click merge. You need to go the "long" way - From the top menu, select VCS/Git/Merge Changes. This will give a dialog box for the merge allowing you to set options, including "No fast forward"

@neverendingqs
Owner Author

/so git log isaccepted:no

@neverendingqs
Owner Author

/so

@github-actions
Contributor

Search anything on Stack Overflow using the /so command!

Usage: /so <query>

@neverendingqs
Owner Author

/so sort [golang]

@github-actions
Contributor

Results for sort [golang]

What is the shortest way to simply sort an array of structs by (arbitrary) field names?

Question

I just had a problem where I had an array of structs, e.g.

package main

import "log"

type Planet struct {
    Name       string  `json:"name"`
    Aphelion   float64 `json:"aphelion"`   // in million km
    Perihelion float64 `json:"perihelion"` // in million km
    Axis       int64   `json:"Axis"`       // in km
    Radius     float64 `json:"radius"`
}

func main() {
    var mars = new(Planet)
    mars.Name = "Mars"
    mars.Aphelion = 249.2
    mars.Perihelion = 206.7
    mars.Axis = 227939100
    mars.Radius = 3389.5

    var earth = new(Planet)
    earth.Name = "Earth"
    earth.Aphelion = 151.930
    earth.Perihelion = 147.095
    earth.Axis = 149598261
    earth.Radius = 6371.0

    var venus = new(Planet)
    venus.Name = "Venus"
    venus.Aphelion = 108.939
    venus.Perihelion = 107.477
    venus.Axis = 108208000
    venus.Radius = 6051.8

    planets := [...]Planet{*mars, *venus, *earth}
    log.Println(planets)
}

Let's say you want to sort it by Axis. How do you do that?

(Note: I have seen http://golang.org/pkg/sort/ and it seems to work, but I have to add about 20 lines just for simple sorting by a very simple key. I have a Python background where it is as simple as sorted(planets, key=lambda n: n.Axis) - is there something similarly simple in Go?)

Answers

By AndreKR (Votes: 321)

As of Go 1.8 you can now use sort.Slice to sort a slice:

sort.Slice(planets, func(i, j int) bool {
  return planets[i].Axis < planets[j].Axis
})

There is normally no reason to use an array instead of a slice, but in your example you are using an array, so you have to overlay it with a slice (add [:]) to make it work with sort.Slice:

sort.Slice(planets[:], func(i, j int) bool {
  return planets[i].Axis < planets[j].Axis
})

The sorting changes the array, so if you really want you can continue to use the array instead of the slice after the sorting.

By James Henstridge (Votes: 63)

UPDATE: This answer relates to older versions of Go. For Go 1.8 and newer, see AndreKR's answer below.


If you want something a bit less verbose than the standard library sort package, you could use the third party github.com/bradfitz/slice package. It uses some tricks to generate the Len and Swap methods needed to sort your slice, so you only need to provide a Less method.

With this package, you can perform the sort with:

slice.Sort(planets[:], func(i, j int) bool {
    return planets[i].Axis < planets[j].Axis
})

The planets[:] part is necessary to produce a slice covering your array. If you make planets a slice instead of an array you could skip that part.

By jimt (Votes: 37)

As of Go 1.8, @AndreKR's answer is the better solution.


You can implement a collection type which implements the sort interface.

Here's an example of two such types which allow you to sort either by Axis or Name:

package main

import "log"
import "sort"

// AxisSorter sorts planets by axis.
type AxisSorter []Planet

func (a AxisSorter) Len() int           { return len(a) }
func (a AxisSorter) Swap(i, j int)      { a[i], a[j] = a[j], a[i] }
func (a AxisSorter) Less(i, j int) bool { return a[i].Axis < a[j].Axis }

// NameSorter sorts planets by name.
type NameSorter []Planet

func (a NameSorter) Len() int           { return len(a) }
func (a NameSorter) Swap(i, j int)      { a[i], a[j] = a[j], a[i] }
func (a NameSorter) Less(i, j int) bool { return a[i].Name < a[j].Name }

type Planet struct {
    Name       string  `json:"name"`
    Aphelion   float64 `json:"aphelion"`   // in million km
    Perihelion float64 `json:"perihelion"` // in million km
    Axis       int64   `json:"Axis"`       // in km
    Radius     float64 `json:"radius"`
}

func main() {
    var mars Planet
    mars.Name = "Mars"
    mars.Aphelion = 249.2
    mars.Perihelion = 206.7
    mars.Axis = 227939100
    mars.Radius = 3389.5

    var earth Planet
    earth.Name = "Earth"
    earth.Aphelion = 151.930
    earth.Perihelion = 147.095
    earth.Axis = 149598261
    earth.Radius = 6371.0

    var venus Planet
    venus.Name = "Venus"
    venus.Aphelion = 108.939
    venus.Perihelion = 107.477
    venus.Axis = 108208000
    venus.Radius = 6051.8

    planets := []Planet{mars, venus, earth}
    log.Println("unsorted:", planets)

    sort.Sort(AxisSorter(planets))
    log.Println("by axis:", planets)

    sort.Sort(NameSorter(planets))
    log.Println("by name:", planets)
}

Sort Go map values by keys

Question

When iterating through the returned map in the code, returned by the topic function, the keys are not appearing in order.

How can I get the keys to be in order / sort the map so that the keys are in order and the values correspond?

Here is the code.

Answers

By Mingyu (Votes: 156)

The Go blog: Go maps in action has an excellent explanation.

When iterating over a map with a range loop, the iteration order is not specified and is not guaranteed to be the same from one iteration to the next. Since Go 1 the runtime randomizes map iteration order, as programmers relied on the stable iteration order of the previous implementation. If you require a stable iteration order you must maintain a separate data structure that specifies that order.

Here's my modified version of example code: http://play.golang.org/p/dvqcGPYy3-

package main

import (
    "fmt"
    "sort"
)

func main() {
    // To create a map as input
    m := make(map[int]string)
    m[1] = "a"
    m[2] = "c"
    m[0] = "b"

    // To store the keys in a slice in sorted order
    keys := make([]int, len(m))
    i := 0
    for k := range m {
        keys[i] = k
        i++
    }
    sort.Ints(keys)

    // To perform the operation you want
    for _, k := range keys {
        fmt.Println("Key:", k, "Value:", m[k])
    }
}

Output:

Key: 0 Value: b
Key: 1 Value: a
Key: 2 Value: c

By joshlf (Votes: 18)

According to the Go spec, the order of iteration over a map is undefined, and may vary between runs of the program. In practice, not only is it undefined, it's actually intentionally randomized. This is because it used to be predictable, and the Go language developers didn't want people relying on unspecified behavior, so they intentionally randomized it so that relying on this behavior was impossible.

What you'll have to do, then, is pull the keys into a slice, sort them, and then range over the slice like this:

var m map[keyType]valueType
keys := sliceOfKeys(m) // you'll have to implement this
for _, k := range keys {
    v := m[k]
    // k is the key and v is the value; do your computation here
}

By Inanc Gumus (Votes: 14)

All of the answers here now contain the old behavior of maps. In Go 1.12+, you can just print a map value and it will be sorted by key automatically. This has been added because it allows the testing of map values easily.

func main() {
    m := map[int]int{3: 5, 2: 4, 1: 3}
    fmt.Println(m)

    // In Go 1.12+
    // Output: map[1:3 2:4 3:5]

    // Before Go 1.12 (the order was undefined)
    // map[3:5 2:4 1:3]
}

Maps are now printed in key-sorted order to ease testing. The ordering rules are:

  • When applicable, nil compares low
  • ints, floats, and strings order by <
  • NaN compares less than non-NaN floats
  • bool compares false before true
  • Complex compares real, then imaginary
  • Pointers compare by machine address
  • Channel values compare by machine address
  • Structs compare each field in turn
  • Arrays compare each element in turn
  • Interface values compare first by reflect.Type describing the concrete type and then by concrete value as described in the previous rules.

When printing maps, non-reflexive key values like NaN were previously displayed as <nil>. As of this release, the correct values are printed.

Read more here.

How to sort struct with multiple sort parameters?

Question

I have an array/slice of members:

type Member struct {
    Id int
    LastName string
    FirstName string
}

var members []Member

My question is how to sort them by LastName and then by FirstName.

Answers

By Muffin Top (Votes: 58)

Use the sort.Slice (available since Go 1.8) or the sort.Sort function to sort a slice of values.

With both functions, the application provides a function that tests if one slice element is less than another slice element. To sort by last name and then first name, compare last name and then first name:

if members[i].LastName < members[j].LastName {
    return true
}
if members[i].LastName > members[j].LastName {
    return false
}
return members[i].FirstName < members[j].FirstName

The less function is specified using an anonymous function with sort.Slice:

var members []Member
sort.Slice(members, func(i, j int) bool {
    if members[i].LastName < members[j].LastName {
        return true
    }
    if members[i].LastName > members[j].LastName {
        return false
    }
    return members[i].FirstName < members[j].FirstName
})

The less function is specified through an interface with the sort.Sort function:

type byLastFirst []Member

func (members byLastFirst) Len() int           { return len(members) }
func (members byLastFirst) Swap(i, j int)      { members[i], members[j] = members[j], members[i] }
func (members byLastFirst) Less(i, j int) bool { 
    if members[i].LastName < members[j].LastName {
       return true
    }
    if members[i].LastName > members[j].LastName {
       return false
    }
    return members[i].FirstName < members[j].FirstName
}

sort.Sort(byLastFirst(members))

Unless performance analysis shows that sorting is a hot spot, use the function that's most convenient for your application.

By abourget (Votes: 22)

Use the newer sort.Slice function as such:

sort.Slice(members, func(i, j int) bool {
    switch strings.Compare(members[i].FirstName, members[j].FirstName) {
    case -1:
        return true
    case 1:
        return false
    }
    return members[i].LastName > members[j].LastName
})

or something like that.

By LVB (Votes: 10)

Another pattern, which I find slightly cleaner:

if members[i].LastName != members[j].LastName {
    return members[i].LastName < members[j].LastName
}

return members[i].FirstName < members[j].FirstName

@neverendingqs
Owner Author

/so [sql-server] upsert

@github-actions
Contributor

Results for [sql-server] upsert

UPSERT in SSIS (Votes: 6)

Asked by Raj More.

Question

I am writing an SSIS package to run on SQL Server 2008. How do you do an UPSERT in SSIS?

IF KEY NOT EXISTS
  INSERT
ELSE
  IF DATA CHANGED
    UPDATE
  ENDIF
ENDIF

Answers

By John Saunders (Votes: 12)

See SQL Server 2008 - Using Merge From SSIS. I've implemented something like this, and it was very easy. Just using the BOL page Inserting, Updating, and Deleting Data using MERGE was enough to get me going.

By wp78de (Votes: 4)

Apart from T-SQL based solutions (and this is not even tagged as sql/tsql), you can use an SSIS Data Flow Task with a Merge Join as described here (and elsewhere).


The crucial part is the Full Outer Join in the Merge Join (if you only want to insert/update and not delete, a Left Outer Join works as well) of your sorted sources.


followed by a Conditional Split to know what to do next: Insert into the destination (which is also my source here), update it (via SQL Command), or delete from it (again via SQL Command).

  1. INSERT: If the gid is found only on the source (left)
  2. UPDATE If the gid exists on both the source and destination
  3. DELETE: If the gid is not found in the source but exists in the destination (right)


By Arnkrishn (Votes: 3)

I would suggest you have a look at Mat Stephen's weblog on SQL Server's upsert.

SQL 2005 - UPSERT: In nature but not by name; but at last!

Upsert from select (Votes: 0)

Asked by Chsir17.

Question

I need to perform an upsert from select like this but in SQL Server. I found someone doing an upsert in SQL Server here, but that's not from a select.

My query basically looks like this right now:

INSERT INTO (table1) (...)
    SELECT (...)
    FROM (table 2)
    WHERE X NOT IN (SELECT Y from (table1) WHERE Y IS NOT NULL)

But I can't figure out how to add the update part. I would like to add the update part after since 99% of the time it will be an insert.

Edit: this code would work if I was able to do it from a SELECT:

BEGIN TRY
    INSERT INTO table1 (id, name, itemname, itemcatName, itemQty)
    VALUES ('val1', 'val2', 'val3', 'val4', 'val5')
END TRY
BEGIN CATCH
    UPDATE table1 
    SET name = 'val2', 
        itemname = 'val3', 
        itemcatName = 'val4', 
        itemQty = 'val5'
    WHERE id = 'val1'
END CATCH

Answers

By George Menoutis (Votes: 1)

Kind of the same line of thinking, I guess:

update t1
set ......
from table1 t1
inner join table2 on t1.Y=table2.X

Notice the mandatory use of an alias for the updated table, in order to update based on a join.

Alternatively, you could look around (or wait for an answer) for MERGE. It is very elegant for doing things like upsert in one statement, but I haven't concluded whether the probable downsides are worth it.

By Dinesh (Votes: 0)

update table1 
  set name = 'val2', itemname = 'val3', itemcatName = 'val4', itemQty = 'val5'
  where id = 'val1'

IF @@ROWCOUNT = 0
insert into table1(id, name, itemname, itemcatName, itemQty)
values('val1', 'val2', 'val3', 'val4', 'val5')

Do an UPSERT: http://en.wikipedia.org/wiki/Upsert

By Mazhar (Votes: 0)

If you want to INSERT if a record doesn't exist or UPDATE if it does then try this

IF EXISTS (SELECT 1 FROM table1 WHERE somecol = @someval)
BEGIN
    UPDATE T1
    SET T1.someCol1 = T2.someCol1
        ,T1.someCol2 = T2.someCol2 
         --, etc,  etc
    FROM table1 T1
    INNER JOIN table2 T2 ON T1.Id = T2.T1_FK
END
ELSE
BEGIN
    INSERT INTO table1 (someCol, someCol2 --,etcetc)
    SELECT T2.someCol1, T2.someCol2 --etc etc
    FROM table2 T2
    WHERE somecol = @someval
END

SSIS Pragmatic works task factory upsert error (Votes: 0)

Asked by Mridul sharma.

Question

We have used "pragmatic works task factory" 2 components i.e. "SalesForce.com Source" and "Upsert Destination".

Local:
Everything is working fine locally. The TF SalesForce Source and Upsert Destination are both working fine. Upsert Destination is using the '.Net Providers\SqlClient Data Provider'.

Server:
After deployment I am getting the error below.

"Data Flow Task:Error: There were errors during task validation."   

"Data Flow Task:Error: TF Upsert Destination failed validation and returned error code 0xC0010009."     

"Data Flow Task:Error: One or more component failed validation."    

"Data Flow Task:Error: System.Runtime.InteropServices.COMException (0xC0010009): Exception from HRESULT: 0xC0010009
   at Microsoft.SqlServer.Dts.Pipeline.Wrapper.IDTSCustomPropertyCollection100.get_Item(Object Index)
   at PW.TaskFactory.UpsertDestinationTask.UpsertDestination.Validate()
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostValidate(IDTSManagedComponentWrapper100 wrapper)"


We are using SQL Server 2012 on the server. I have searched on the internet but have not found any solution for this. Thanks for any help.

Answers

@neverendingqs
Owner Author

/so [javascript] array deconstruct

@github-actions
Contributor

Results for [javascript] array deconstruct

How to perform Array deconstruct and then retrieve deconstructed object's value at the same time in JavaScript? (Votes: 2)

Asked by Joseph Wang.

Question

Please see this minimum example

const [one, two, three] = [
  { name: "Joseph" },
  { name: "Amy" },
  { name: "Jim" }
];

I can array deconstruct the first level; how can I do it with the second level, like this:

const [oneName: one.name, twoName: two.name, threeName: three.name] = [
  { name: "Joseph" },
  { name: "Amy" },
  { name: "Jim" }
];

It's pseudocode, the syntax is wrong, but you get the idea.

Is it possible to do this in JavaScript?

Answers

By CertainPerformance (Votes: 7)

It's a bit like declaring an array of objects, only with a const in front of it. Instead of putting the values on the right-hand side of the :, put the new variable name that the string should go into:

const [{name: oneName}, {name: twoName}, {name: threeName}] = [
  { name: "Joseph" },
  { name: "Amy" },
  { name: "Jim" }
];

console.log(oneName);
console.log(twoName);
console.log(threeName);

But while this is possible to do, it's weird and (IMO) pretty unreadable. I wouldn't recommend it.
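
If the goal is simply to pull the three names out, a plain map is arguably more readable than nested destructuring. A small sketch of my own (not part of the answer above), using the same sample data:

const people = [
  { name: "Joseph" },
  { name: "Amy" },
  { name: "Jim" }
];

// Map to the name strings first, then destructure the resulting array.
const [oneName, twoName, threeName] = people.map(p => p.name);

console.log(oneName); // "Joseph"
console.log(twoName); // "Amy"
console.log(threeName); // "Jim"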

Deconstructing an array in increments of two (Votes: 6)

Asked by JimmyNeedles.

Question

Say I have an array like so:

let arr = [1, 2, 3, 4, 5, 6, "a"]

How can I deconstruct it into individual variables in increments of two? So that let one = [1, 2], let two = [3, 4], etc? I know that you can deconstruct an array using individual variables like so:

let one, two, three;
[one, two, three] = arr

But doing it in increments of two, is that possible?

Answers

By Akrion (Votes: 8)

Another way you could do this is via ES6 generator function:

let arr = [1, 2, 3, 4, 5, 6, 'a']

function* batch (arr, n=2) {
  let index = 0
  while (index < arr.length) {
    yield arr.slice(index, index + n)
    index += n
  }
}

let [first, second, third] = batch(arr)

console.log(first)
console.log(second)
console.log(third)

Or even more generic approach as kindly suggested by @Patrick Roberts:

let arr = [1, 2, 3, 4, 5, 6, 'a']

function* batch (iterable, length=2) {
  let array = [];
  for (const value of iterable) {
    if (array.push(value) === length) {
      yield array;
      array = [];
    }
  }
  if (array.length) yield array;
}

let [first, second, third] = batch(arr);

console.log(first)
console.log(second)
console.log(third)

And finally a curried version:

let arr = [1, 2, 3, 4, 5, 6, 'a']

const batch = length => function* (iterable) {
  let array = [];
  for (const value of iterable) {
    if (array.push(value) === length) {
      yield array;
      array = [];
    }
  }
  if (array.length) yield array;
}

let [first, second, third] = batch(2)(arr);

console.log(first)
console.log(second)
console.log(third)
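
The generator versions above produce the pairs lazily. For comparison, a plain (non-generator) chunk helper does the same grouping eagerly; this is a sketch of my own, not part of Akrion's answer:

// Split an array into subarrays of `size` elements (the last chunk may be shorter).
function chunk<T>(arr: T[], size = 2): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < arr.length; i += size) {
    out.push(arr.slice(i, i + size));
  }
  return out;
}

const [first, second, third] = chunk([1, 2, 3, 4, 5, 6, "a"]);

console.log(first);  // [1, 2]
console.log(second); // [3, 4]
console.log(third);  // [5, 6]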

By Nina Scholz (Votes: 5)

You could take an array and assign each value to an explicit target.

An automatic grouping into arrays of length two is not possible on the left-hand side, not even with spread syntax.

let array = [1, 2, 3, 4, 5, 6, "a"],
    one = [], two = [], three = [];

[one[0], one[1], two[0], two[1], three[0], three[1]] = array;

console.log(one);
console.log(two);
console.log(three);

By zer00ne (Votes: 4)

.flatMap() is a combination of .map() and .flat(): it allows the callback to skip returning a value, return a single value, or return multiple values, which is made possible by returning arrays that get flattened in the final step. This demo's callback is a ternary that returns two values in an array on every other iteration and an empty array on the alternating iterations. The result is an array of arrays, which is then destructured.

let arr = [1, 2, 3, 4, 5, 6, "a"];

let newArr = arr.flatMap((node, index, array) => index % 2 !== 0 ? [
  [array[index - 1], node]
] : []);

let [A, B, C] = newArr;

console.log(A);
console.log(B);
console.log(C);

Type safe version of lodash _.get - deconstruct array type conditionally (Votes: 5)

Asked by caesay.

Question

For a long time we've been having a problem that the only way to access a nested property safely and easily is to use _.get. Ex:

_.get(obj, "Some.Nested[2].Property", defaultValue);

This works great, but doesn't stand up to property renames, which happen frequently. Theoretically, it should be possible to transform the above into the following and allow TypeScript to implicitly type-check it:

safeGet(obj, "Some", "Nested", 2, "Property", defaultValue);

I've been successful in creating such a typing for everything except for array types:

function getSafe<TObject, P1 extends keyof TObject>(obj: TObject, p1: P1): TObject[P1];

function getSafe<TObject, P1 extends keyof TObject, P2 extends keyof TObject[P1]>(obj: TObject, p1: P1, p2: P2): TObject[P1][P2];

This properly checks for items at depth (I will auto generate these statements to 10 levels or so). It fails with array properties because the type passed into the next parameter is T[] and not T.

The complexity or verbosity of any solution is of no consideration as the code will be auto generated, the problem is that I can't seem to find any combination of type declarations which will allow me to accept an integer parameter and deconstruct the array type moving forward.

You can deconstruct an array (where T is an array) using T[number]. The problem is that I have no way to constrain where T is an array on a nested property.

function getSafe<TObject, P1 extends keyof TObject, P2 extends keyof TObject[P1][number]>(obj: TObject, p1: P1, index: number, p2: P2): TObject[P1][number][P2];
                                                                                 ^^^^^^                                                             ^^^^^^
const test2 = getSafe(obj, "Employment", 0, "Id"); // example usage

That actually works at the call-site (no errors there, correctly gives us param and return types), but gives us an error in the declaration itself because you can't index TObject[P1] with [number] as we can't guarantee TObject[P1] is an array.

(Note: TType[number] is a viable way to get the element type from an array type, but we need to convince the compiler that we're doing this to an array)

The question is really: would there be a way to add an array constraint to TObject[P1], or is there another way to do this that I'm missing?

Answers

By caesay (Votes: 4)

I've since figured this out and published an npm package here: ts-get-safe

The key point was figuring out how to conditionally deconstruct the array type into its element type. In order to do that, you first have to assert that all the properties are either an array or never. The type that solved the equation was:

type GSArrEl<TKeys extends keyof TObj, TObj> = { [P in TKeys]: undefined[] & TObj[P] }[TKeys][number];

The magic is in { [P in TKeys]: undefined[] & TObj[P] }, where we essentially intersect each property of TObj with undefined[]. Because we are then sure that each property is either an array or never (it will be never on each property which is not an array), we can then use the destructuring expression [number] to get the element type.

Here is an example of two array destructuring happening at the same time:

function getSafe<TObject, P0 extends keyof TObject, A1 extends GSArrEl<P0, TObject>, P2 extends keyof A1, P3 extends keyof A1[P2], A4 extends GSArrEl<P3, A1[P2]>>(obj: TObject, p0: P0, a1: number, p2: P2, p3: P3, a4: number): A4;

Hundreds of combinations of arrays and object properties have been generated in my ts-get-safe library and are ready to use, however I'm still open to ways to improve this in a generic way such that we can use a dynamic number of parameters in the same declaration. Even a way to combine Array and Property navigation into the same type constraint so we don't have to generate every variation of array and property access.

@neverendingqs
Copy link
Owner Author

/so [github-actions] hub

@github-actions
Copy link
Contributor

Results for [github-actions] hub

Using elm-test in a git hub action (Votes: 2)

Asked by Ackdari.

Question

I want to use a GitHub action to test and build my Elm package whenever a commit is pushed to the master branch. For this, my action .yml file looks like this:

name: CI

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Setup Elm environment
      uses: JorelAli/setup-elm@v1
      with:
        # Version of Elm to use. E.g. 0.19.1
        elm-version: 0.19.1
    - run: |
        sudo npm install -g elm-test # this fails
        elm-test
        elm make

For testing I want to use elm-test, which can be installed via npm, but the command sudo npm install -g elm-test fails with:

/usr/local/bin/elm-test -> /usr/local/lib/node_modules/elm-test/bin/elm-test

> [email protected] install /usr/local/lib/node_modules/elm-test/node_modules/elmi-to-json
> binwrap-install

ERR Error: EACCES: permission denied, mkdir '/usr/local/lib/node_modules/elm-test/node_modules/elmi-to-json/unpacked_bin'
    at Object.mkdirSync (fs.js:823:3)
    at /usr/local/lib/node_modules/elm-test/node_modules/binwrap/binstall.js:46:10
    at new Promise (<anonymous>)
    at untgz (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/binstall.js:21:10)
    at binstall (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/binstall.js:11:12)
    at install (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/install.js:20:10)
    at Object.install (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/index.js:14:14)
    at Object.<anonymous> (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/bin/binwrap-install:18:9)
    at Module._compile (internal/modules/cjs/loader.js:955:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:991:10)
    at Module.load (internal/modules/cjs/loader.js:811:32)
    at Function.Module._load (internal/modules/cjs/loader.js:723:14)
    at Function.Module.runMain (internal/modules/cjs/loader.js:1043:10)
    at internal/main/run_main_module.js:17:11 {
  errno: -13,
  syscall: 'mkdir',
  code: 'EACCES',
  path: '/usr/local/lib/node_modules/elm-test/node_modules/elmi-to-json/unpacked_bin'
}
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules/elm-test/node_modules/fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"})

npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] install: `binwrap-install`
npm ERR! Exit status 1
npm ERR! 
npm ERR! Failed at the [email protected] install script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

npm ERR! A complete log of this run can be found in:
npm ERR!     /home/runner/.npm/_logs/2020-02-03T17_50_06_232Z-debug.log

Any advice on how to install elm-test inside a git hub action?

Edit: Without the sudo the error becomes

npm WARN checkPermissions Missing write access to /usr/local/lib/node_modules
npm ERR! code EACCES
npm ERR! syscall access
npm ERR! path /usr/local/lib/node_modules
npm ERR! errno -13
npm ERR! Error: EACCES: permission denied, access '/usr/local/lib/node_modules'
npm ERR!  [Error: EACCES: permission denied, access '/usr/local/lib/node_modules'] {
npm ERR!   stack: "Error: EACCES: permission denied, access '/usr/local/lib/node_modules'",
npm ERR!   errno: -13,
npm ERR!   code: 'EACCES',
npm ERR!   syscall: 'access',
npm ERR!   path: '/usr/local/lib/node_modules'
npm ERR! }
npm ERR! 
npm ERR! The operation was rejected by your operating system.
npm ERR! It is likely you do not have the permissions to access this file as the current user
npm ERR! 
npm ERR! If you believe this might be a permissions issue, please double-check the
npm ERR! permissions of the file and its containing directories, or try running
npm ERR! the command again as root/Administrator.

npm ERR! A complete log of this run can be found in:
npm ERR!     /home/runner/.npm/_logs/2020-02-04T13_41_34_534Z-debug.log
Answers

By Ackdari (Votes: 3)

Thanks to @toastal's comment I found another solution: set up a package.json file using npm and then add elm-test as a dependency with the command

npm install -D elm-test

Also, you might want to add node_modules to your .gitignore to ignore npm's installation folder.

Then in the yml file you can just run the command npm install and elm-test gets installed. Then you can call it with

./node_modules/elm-test/bin/elm-test

My yml file now looks like this

name: Tests

on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Setup Elm environment
      uses: JorelAli/setup-elm@v1
      with:
        # Version of Elm to use. E.g. 0.19.1
        elm-version: 0.19.1
    - name: install npm dependencies
      run: npm install
    - name: Test
      run: ./node_modules/elm-test/bin/elm-test
    - name: check documentation
      run: |
        elm make --docs=docs.json
        rm docs.json

By Ackdari (Votes: 2)

Thanks to @glennsl I found a solution. The solution is to specify where npm installs packages globally. Since the runner that runs the action has no access to /usr/local/lib, npm can't install anything at the global level. The solution (as described here) is to create a folder to use as a "global" installation folder and configure npm to use it. It is also necessary to add the new folder to the PATH environment variable.

So my yml file now looks like this

name: CI

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Setup Elm environment
      uses: JorelAli/setup-elm@v1
      with:
        # Version of Elm to use. E.g. 0.19.1
        elm-version: 0.19.1
    - run: |
        mkdir ~/.npm-global
        npm config set prefix '~/.npm-global'
        PATH=~/.npm-global/bin:$PATH
        npm install -g elm-test
        elm-test
        elm make

Where should you put the git hub workflow directory for actions in a full-stack project? (Votes: 1)

Asked by Austin S.

Question

I am building a full-stack React project with a Node backend and would like to implement GitHub actions to automate pull requests for continuous integration. Currently, in my root directory, I have my project split into two directories: a front-end directory holding the client-side application and a back-end directory containing the server side. My question is: where should I place the .github directory and the associated workflow directory in the project? At the root level, or in either the client-side or server-side directory?


Answers

By Lucas Golven (Votes: 1)

From Github documentation:

You must store workflows in the .github/workflows directory in the root of your repository.

(workflow file cannot be inside subdirectory, it must be in .github/workflows)

You can define a naming convention for your repository: To keep the same logic that you were trying to implement with the subfolders, you could have .github/workflows/economic_growth.back-end.yaml and .github/workflows/economic_growth.front-end.yaml.

Can I run a docker container with GitHub actions? (Votes: 1)

Asked by Bruno Brant.

Question

I want to perform CI with tests that depend on a container that is published on Docker Hub. Is it possible? Can I start the container and run tests that depend on it?

Answers

By scthi (Votes: 4)

Yes, here is an example using docker-compose:

  test:
    name: Test
    runs-on: ubuntu-latest
    steps:
      - name: Check out code
        uses: actions/checkout@v1

      - name: Login to Docker Registry
        run: docker login "$DOCKER_REGISTRY" -u "$DOCKER_USERNAME" --password-stdin <<< "$DOCKER_PASSWORD"
        env:
          DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
          DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}
          DOCKER_REGISTRY: ${{ secrets.DOCKER_REGISTRY }}
      - name: Start env
        run: docker-compose up
      - name: Run tests
        run: ...

@hedyhli
Copy link

hedyhli commented Aug 23, 2020

/so how to exit vim

@github-actions
Copy link
Contributor

Results for how to exit vim

How to temporarily exit Vim and go back (Votes: 306)

Asked by Ricky.

Question

How could I exit Vim, not :q, and then go back to continue editing?

Answers

By zen (Votes: 575)

Assuming terminal Vim on a flavor of *nix:

To suspend your running Vim

Ctrl + Z

will suspend the process and get back to your shell

fg

will resume (bring to foreground) your suspended Vim.

To start a new shell

Start a subshell using:

:sh

(as configured by)

:set shell?

or

:!bash

followed by:

Ctrl+D (or exit, but why type so much?)

to kill the shell and return to Vim.

By Pierre-Antoine LaFayette (Votes: 107)

You can use :sh to exit to your default shell then typing $ exit at the shell prompt will return you to Vim.

By Jay Zeng (Votes: 34)

You can switch to shell mode temporarily by:

:! <command>

such as

:! ls

How to exit from "vim -y" in console? (Votes: 15)

Asked by rdo.

Question

Accidentally I typed vim -y install python-requests instead of yum ... and I do not know how to exit from vim now. The standard option with Shift + : + q! does not work. Are there any options for exiting vim without killing it?

Answers

By Ingo Karkat (Votes: 17)

With -y (easy mode), Vim defaults to insert mode, and you cannot permanently exit to normal mode via <Esc>. However, like in default Vim, you can issue a single normal mode command via <C-O>. So to exit, type <C-O>:q!<CR>.

Alternatively, there's a special <C-L> mapping for easy mode that returns to normal mode.

By dotslashlu (Votes: 6)

-y option makes vim start in easy mode, you can type CTRL-L to return to normal mode and then type :q! to exit.

How to exit Ex mode in vim (typing :visual does not work) (Votes: 2)

Asked by AlbertMunichMar.

Question

I don't know how to exit Ex mode in vim. I think I typed something that is still open and don't know how to close it:

<uments/LeWagon/Teacher_Lessons theory.rb                                                                                                                                            
Entering Ex mode.  Type "visual" to go to Normal mode.
:w
"theory.rb" 22L, 472C written
:i
:w
:q
visual
:visual
visual
^W^W^W^W^K^W
kj^W
exit
:exit
:visual
visual

Thanks

Answers

By Ry- (Votes: 8)

You got into insert mode inside Ex mode with :i, so you need to leave it with a line containing only a period:

.

Then :visual will work.

In other words: Enter . Enter v i s u a l Enter. ^C should also work, i.e. Ctrl+C v i Enter.

@hedyhli
Copy link

hedyhli commented Aug 23, 2020

cool :)

@AndreKR
Copy link

AndreKR commented Aug 23, 2020

So it seems when your GitHub username matches the one on Stackoverflow you get mentioned? 😆

@hedyhli
Copy link

hedyhli commented Aug 23, 2020

What do you mean?

@AndreKR
Copy link

AndreKR commented Aug 23, 2020

Well, I believe I have nothing to do with this project at all, but I'm getting notifications...


Presumably this is because I was mentioned here: #1 (comment) when my Stackoverflow answer was copied to GitHub. My GitHub username is the same as my Stackoverflow username, that's probably why it made the connection?

@glennsl
Copy link

glennsl commented Aug 23, 2020

The first result on this post mentions me on SO by name ("Thanks to @glennsl ..."), which causes GitHub to send me a notification that I've been mentioned in this thread. Would it be possible to escape @ mentions on SO so that this doesn't happen?

@hedyhli
Copy link

hedyhli commented Aug 23, 2020

@neverendingqs @glennsl @AndreKR I think I found a way:
@<!---->glennsl
renders as
@glennsl
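
If the workflow wanted to do this automatically, the same trick could be applied to the comment body before posting it. This is only a sketch of my own, not taken from the workflow's actual source; escapeMentions is a hypothetical helper name:

// Insert an empty HTML comment after each "@" that starts a username, so GitHub
// still renders "@name" but no longer treats it as a mention/notification.
function escapeMentions(body: string): string {
  return body.replace(/@(?=\w)/g, "@<!---->");
}

console.log(escapeMentions("Thanks to @glennsl I found a solution."));
// -> "Thanks to @<!---->glennsl I found a solution."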

@neverendingqs
Copy link
Owner Author

So sorry! I will take a look at the first opportunity I have at my computer.

@WaylonWalker
Copy link

/so how to search stack overflow api

@github-actions
Copy link
Contributor

Results for how to search stack overflow api

How to get questions using StackOverflow API V 2.2 in PHP (Votes: 2)

Asked by sachinmanit.

Question

I am working on a project and want to get questions from Stack Overflow using the Stack Overflow API. I searched for how to achieve this here:

  1. how to get a list of questions from stackoverflow API based on search query?
  2. Getting null as response from Stack Overflow API with PHP
  3. How to use Stack Overflow API in PHP

But I found nothing useful as they are old techniques. They are using old Stack Overflow API versions.

While going through Stack Apps I read that to achieve this task I have to register my app. I have registered an app on www.stackapps.com and I got an APP ID and its SECRET KEY.

I visited here https://api.stackexchange.com/docs/authentication to know how to get data from stack API V 2.2. They have given useful links to get data using Stack Overflow API V 2.0 using authentication via OAuth 2.0

It says to ask for a request from the user and then get a "code", etc. I got stuck here. What should the process be in order to move forward using PHP?

My app should do this:

Let input link be : https://api.stackexchange.com/2.2/questions?order=desc&sort=activity&site=stackoverflow

I get data of this page in any of the two form either HTML or JSON.

Can it be done without OAuth? If not, then please guide me.

Answers

By rachitmanit (Votes: 3)

Assuming that you have registered your application.

Go to Manage your applications from here https://stackapps.com/apps/oauth

Enable Client side flow within App settings.

Now as you want to create Desktop application, following are steps:

  1. First you need to get Access Token by redirecting user to this link: https://stackexchange.com/oauth/dialog?client_id=[YOUR_APP_ID]&scope=private_info&redirect_uri=https://stackexchange.com/oauth/login_success

  2. You (as a user) will approve the request made by the application. Then you will be redirected to another link in which there will be an access_token.

  3. Grab the access token from there and put it in this link :

https://api.stackexchange.com/2.2/questions?order=desc&sort=activity&site=stackoverflow&key=[YOUR_APP_KEY]&access_token=[YOUR_ACCESS_TOKEN]&filter=withbody

This will be your ready API.

Get content using:

// Note: PHP stream context options for HTTP(S) requests go under the 'http' key.
$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
$json_array = file_get_contents("YOUR_API_URL", false, $context);
$data = json_decode(gzdecode($json_array), true); // the API returns gzip-compressed JSON
print_r($data); // Show your file data

The data you will get is in JSON and is compressed (in GZIP form), so we decompress it and then decode the JSON.

This should work. It worked for me. :)
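
For what it's worth, reading public questions from the 2.2 API can also be done with a plain unauthenticated GET; as far as I know, the key and access_token mainly raise the request quota for read-only endpoints. A sketch of my own in TypeScript (assuming Node 18+ or a browser, where fetch is available):

// Fetch recent questions from the Stack Exchange 2.2 API and print their titles.
async function fetchQuestions(tag = "php"): Promise<void> {
  const url =
    "https://api.stackexchange.com/2.2/questions" +
    `?order=desc&sort=activity&site=stackoverflow&tagged=${encodeURIComponent(tag)}`;
  const res = await fetch(url); // the response is gzip-compressed JSON; fetch handles the decompression
  const data = await res.json();
  for (const q of data.items ?? []) {
    console.log(`${q.title} -> ${q.link}`);
  }
}

fetchQuestions().catch(console.error);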

Stack Overflow job API? (Votes: -1)

Asked by pmilo.

Question

How can I get access to Stack Overflow's job API (subscription-based or otherwise)? I can only find Stack Overflow's jobs RSS feed, but the data available is limited, with inconsistencies in format.

I'm building a job search platform for my portfolio. I'm currently using the Adzuna Job Search API, but again their data is limited and they don't offer subscription services for the feature sets I want to build.

At the very least, I'm looking for a full job description with consistently formatted title, location and employment type, plus a job ad URL.

Can anyone recommend a usable jobs search/list API?

Answers

How to instantly inform search engines(google) to index new url's (Votes: -1)

Asked by F.E Noel Nfebe.

Question

I am working on creating a Q&A site like Stack Overflow for particular topics. Just like on Stack Overflow, when a new question is asked a dynamic URL is generated, for example mysite.com/questions/{id}/{title}. How can I get search engines to index these new URLs immediately after a new question is asked? Something like:

if (question.asked()) {
  url = question.url;
  search_engines.index(url);
}

Is there any API provided by popular search engines like Google and Bing to automatically inform them about new URLs? Thanks in advance.

Answers

By Potky (Votes: 0)

  1. Add your new URLs dynamically to an XML sitemap. An XML sitemap helps all search engines crawl your website.
  2. Go to Search Console or Bing Webmaster Tools and ask them to index your new URLs.

They decide on their own when to index URLs. You have to wait. That's all you can do.

By Chris Harris (Votes: 0)

Go to Google Search Console, then click on Fetch as Google under Crawl.
There you can submit your latest URL and click OK.
Google will fetch your new URL and it will then appear in search results.

By Kavita Sharma (Votes: 0)

Add your website to https://www.google.com/webmasters/tools/submit-url

Add your website to https://www.bing.com/toolbox/submit-site-url

Add a sitemap.

Divide your content and make individual sitemaps, like https://bskud.com: it has one sitemap and then divides its content by state:

https://bskud.com/HIMACHAL-PRADESH-SITE-MAP.php

https://bskud.com/JHARKHAND-SITE-MAP.php

@neverendingqs
Copy link
Owner Author

/so [github-actions] hub

@github-actions
Copy link
Contributor

Results for [github-actions] hub

Using elm-test in a git hub action (Votes: 2)

Asked by Ackdari.

Question

I want to use a GitHub action to test and build my Elm package whenever a commit is pushed to the master branch. For this, my action .yml file looks like this:

name: CI

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Setup Elm environment
      uses: JorelAli/setup-elm@v1
      with:
        # Version of Elm to use. E.g. 0.19.1
        elm-version: 0.19.1
    - run: |
        sudo npm install -g elm-test # this fails
        elm-test
        elm make

For testing I want to use elm-test, which can be installed via npm, but the command sudo npm install -g elm-test fails with:

/usr/local/bin/elm-test -> /usr/local/lib/node_modules/elm-test/bin/elm-test

> [email protected] install /usr/local/lib/node_modules/elm-test/node_modules/elmi-to-json
> binwrap-install

ERR Error: EACCES: permission denied, mkdir '/usr/local/lib/node_modules/elm-test/node_modules/elmi-to-json/unpacked_bin'
    at Object.mkdirSync (fs.js:823:3)
    at /usr/local/lib/node_modules/elm-test/node_modules/binwrap/binstall.js:46:10
    at new Promise (<anonymous>)
    at untgz (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/binstall.js:21:10)
    at binstall (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/binstall.js:11:12)
    at install (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/install.js:20:10)
    at Object.install (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/index.js:14:14)
    at Object.<anonymous> (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/bin/binwrap-install:18:9)
    at Module._compile (internal/modules/cjs/loader.js:955:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:991:10)
    at Module.load (internal/modules/cjs/loader.js:811:32)
    at Function.Module._load (internal/modules/cjs/loader.js:723:14)
    at Function.Module.runMain (internal/modules/cjs/loader.js:1043:10)
    at internal/main/run_main_module.js:17:11 {
  errno: -13,
  syscall: 'mkdir',
  code: 'EACCES',
  path: '/usr/local/lib/node_modules/elm-test/node_modules/elmi-to-json/unpacked_bin'
}
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules/elm-test/node_modules/fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"})

npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] install: `binwrap-install`
npm ERR! Exit status 1
npm ERR! 
npm ERR! Failed at the [email protected] install script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

npm ERR! A complete log of this run can be found in:
npm ERR!     /home/runner/.npm/_logs/2020-02-03T17_50_06_232Z-debug.log

Any advice on how to install elm-test inside a git hub action?

Edit: Without the sudo the error becomes

npm WARN checkPermissions Missing write access to /usr/local/lib/node_modules
npm ERR! code EACCES
npm ERR! syscall access
npm ERR! path /usr/local/lib/node_modules
npm ERR! errno -13
npm ERR! Error: EACCES: permission denied, access '/usr/local/lib/node_modules'
npm ERR!  [Error: EACCES: permission denied, access '/usr/local/lib/node_modules'] {
npm ERR!   stack: "Error: EACCES: permission denied, access '/usr/local/lib/node_modules'",
npm ERR!   errno: -13,
npm ERR!   code: 'EACCES',
npm ERR!   syscall: 'access',
npm ERR!   path: '/usr/local/lib/node_modules'
npm ERR! }
npm ERR! 
npm ERR! The operation was rejected by your operating system.
npm ERR! It is likely you do not have the permissions to access this file as the current user
npm ERR! 
npm ERR! If you believe this might be a permissions issue, please double-check the
npm ERR! permissions of the file and its containing directories, or try running
npm ERR! the command again as root/Administrator.

npm ERR! A complete log of this run can be found in:
npm ERR!     /home/runner/.npm/_logs/2020-02-04T13_41_34_534Z-debug.log
Answers

By Ackdari (Votes: 3)

Thanks to @toastal's comment I found another solution: set up a package.json file using npm and then add elm-test as a dependency with the command

npm install -D elm-test

Also, you might want to add node_modules to your .gitignore to ignore npm's installation folder.

Then in the yml file you can just run the command npm install and elm-test gets installed. Then you can call it with

./node_modules/elm-test/bin/elm-test

My yml file now looks like this

name: Tests

on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Setup Elm environment
      uses: JorelAli/setup-elm@v1
      with:
        # Version of Elm to use. E.g. 0.19.1
        elm-version: 0.19.1
    - name: install npm dependencies
      run: npm install
    - name: Test
      run: ./node_modules/elm-test/bin/elm-test
    - name: check documentation
      run: |
        elm make --docs=docs.json
        rm docs.json

By Ackdari (Votes: 2)

Thanks to @glennsl I found a solution. The solution is to specify where npm installs packages globally. Since the runner that runs the action has no access to /usr/local/lib, npm can't install anything at the global level. The solution (as described here) is to create a folder to use as a "global" installation folder and configure npm to use it. It is also necessary to add the new folder to the PATH environment variable.

So my yml file now looks like this

name: CI

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Setup Elm environment
      uses: JorelAli/setup-elm@v1
      with:
        # Version of Elm to use. E.g. 0.19.1
        elm-version: 0.19.1
    - run: |
        mkdir ~/.npm-global
        npm config set prefix '~/.npm-global'
        PATH=~/.npm-global/bin:$PATH
        npm install -g elm-test
        elm-test
        elm make

Where should you put the git hub workflow directory for actions in a full-stack project? (Votes: 1)

Asked by Austin S.

Question

I am building a full-stack React project with a Node backend and would like to implement GitHub actions to automate pull requests for continuous integration. Currently, in my root directory, I have my project split into two directories: a front-end directory holding the client-side application and a back-end directory containing the server side. My question is: where should I place the .github directory and the associated workflow directory in the project? At the root level, or in either the client-side or server-side directory?


Answers

By Lucas Golven (Votes: 1)

From Github documentation:

You must store workflows in the .github/workflows directory in the root of your repository.

(workflow file cannot be inside subdirectory, it must be in .github/workflows)

You can define a naming convention for your repository: To keep the same logic that you were trying to implement with the subfolders, you could have .github/workflows/economic_growth.back-end.yaml and .github/workflows/economic_growth.front-end.yaml.

Can I run a docker container with GitHub actions? (Votes: 1)

Asked by Bruno Brant.

Question

I want to perform CI with tests that depend on a container that is published on Docker Hub. Is it possible? Can I start the container and run tests that depend on it?

Answers

By scthi (Votes: 4)

Yes, here is an example using docker-compose:

  test:
    name: Test
    runs-on: ubuntu-latest
    steps:
      - name: Check out code
        uses: actions/checkout@v1

      - name: Login to Docker Registry
        run: docker login "$DOCKER_REGISTRY" -u "$DOCKER_USERNAME" --password-stdin <<< "$DOCKER_PASSWORD"
        env:
          DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
          DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}
          DOCKER_REGISTRY: ${{ secrets.DOCKER_REGISTRY }}
      - name: Start env
        run: docker-compose up
      - name: Run tests
        run: ...

@neverendingqs
Copy link
Owner Author

!! Sorry I tested this locally and it worked...

@neverendingqs
Copy link
Owner Author

/so [github-actions] hub

@github-actions
Copy link
Contributor

Results for [github-actions] hub

Using elm-test in a git hub action (Votes: 2)

Asked by Ackdari.

Question

I want to use a GitHub action to test and build my Elm package whenever a commit is pushed to the master branch. For this, my action .yml file looks like this:

name: CI

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Setup Elm environment
      uses: JorelAli/setup-elm@v1
      with:
        # Version of Elm to use. E.g. 0.19.1
        elm-version: 0.19.1
    - run: |
        sudo npm install -g elm-test # this fails
        elm-test
        elm make

For testing I want to use elm-test, which can be installed via npm, but the command sudo npm install -g elm-test fails with:

/usr/local/bin/elm-test -> /usr/local/lib/node_modules/elm-test/bin/elm-test

> [email protected] install /usr/local/lib/node_modules/elm-test/node_modules/elmi-to-json
> binwrap-install

ERR Error: EACCES: permission denied, mkdir '/usr/local/lib/node_modules/elm-test/node_modules/elmi-to-json/unpacked_bin'
    at Object.mkdirSync (fs.js:823:3)
    at /usr/local/lib/node_modules/elm-test/node_modules/binwrap/binstall.js:46:10
    at new Promise (<anonymous>)
    at untgz (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/binstall.js:21:10)
    at binstall (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/binstall.js:11:12)
    at install (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/install.js:20:10)
    at Object.install (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/index.js:14:14)
    at Object.<anonymous> (/usr/local/lib/node_modules/elm-test/node_modules/binwrap/bin/binwrap-install:18:9)
    at Module._compile (internal/modules/cjs/loader.js:955:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:991:10)
    at Module.load (internal/modules/cjs/loader.js:811:32)
    at Function.Module._load (internal/modules/cjs/loader.js:723:14)
    at Function.Module.runMain (internal/modules/cjs/loader.js:1043:10)
    at internal/main/run_main_module.js:17:11 {
  errno: -13,
  syscall: 'mkdir',
  code: 'EACCES',
  path: '/usr/local/lib/node_modules/elm-test/node_modules/elmi-to-json/unpacked_bin'
}
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules/elm-test/node_modules/fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"})

npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] install: `binwrap-install`
npm ERR! Exit status 1
npm ERR! 
npm ERR! Failed at the [email protected] install script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

npm ERR! A complete log of this run can be found in:
npm ERR!     /home/runner/.npm/_logs/2020-02-03T17_50_06_232Z-debug.log

Any advice on how to install elm-test inside a git hub action?

Edit: Without the sudo the error becomes

npm WARN checkPermissions Missing write access to /usr/local/lib/node_modules
npm ERR! code EACCES
npm ERR! syscall access
npm ERR! path /usr/local/lib/node_modules
npm ERR! errno -13
npm ERR! Error: EACCES: permission denied, access '/usr/local/lib/node_modules'
npm ERR!  [Error: EACCES: permission denied, access '/usr/local/lib/node_modules'] {
npm ERR!   stack: "Error: EACCES: permission denied, access '/usr/local/lib/node_modules'",
npm ERR!   errno: -13,
npm ERR!   code: 'EACCES',
npm ERR!   syscall: 'access',
npm ERR!   path: '/usr/local/lib/node_modules'
npm ERR! }
npm ERR! 
npm ERR! The operation was rejected by your operating system.
npm ERR! It is likely you do not have the permissions to access this file as the current user
npm ERR! 
npm ERR! If you believe this might be a permissions issue, please double-check the
npm ERR! permissions of the file and its containing directories, or try running
npm ERR! the command again as root/Administrator.

npm ERR! A complete log of this run can be found in:
npm ERR!     /home/runner/.npm/_logs/2020-02-04T13_41_34_534Z-debug.log
Answers

By Ackdari (Votes: 3)

Thanks to `@toastal`'s comment I found another solution: set up a package.json file using npm and then add elm-test as a dependency with the command

npm install -D elm-test

Also, you might want to add node_modules to your .gitignore to ignore npm's installation folder.

Then in the yml file you can just run the command npm install and elm-test gets installed. Then you can call it with

./node_modules/elm-test/bin/elm-test

My yml file now looks like this

name: Tests

on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Setup Elm environment
      uses: JorelAli/setup-elm@v1
      with:
        # Version of Elm to use. E.g. 0.19.1
        elm-version: 0.19.1
    - name: install npm dependencies
      run: npm install
    - name: Test
      run: ./node_modules/elm-test/bin/elm-test
    - name: check documentation
      run: |
        elm make --docs=docs.json
        rm docs.json

By Ackdari (Votes: 2)

Thanks to `@glennsl` I found a solution. The solution is to specify where npm installs packages globally. Since the runner that runs the action has no access to /usr/local/lib, npm can't install anything at the global level. The solution (as described here) is to create a folder to use as a "global" installation folder and configure npm to use it. It is also necessary to add the new folder to the PATH environment variable.

So my yml file now looks like this

name: CI

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Setup Elm environment
      uses: JorelAli/setup-elm@v1
      with:
        # Version of Elm to use. E.g. 0.19.1
        elm-version: 0.19.1
    - run: |
        mkdir ~/.npm-global
        npm config set prefix '~/.npm-global'
        PATH=~/.npm-global/bin:$PATH
        npm install -g elm-test
        elm-test
        elm make

Where should you put the git hub workflow directory for actions in a full-stack project? (Votes: 1)

Asked by Austin S.

Question

I am building a full-stack React project with a Node backend and would like to implement GitHub actions to automate pull requests for continuous integration. Currently, in my root directory, I have my project split into two directories: a front-end directory holding the client-side application and a back-end directory containing the server side. My question is: where should I place the .github directory and the associated workflow directory in the project? At the root level, or in either the client-side or server-side directory?


Answers

By Lucas Golven (Votes: 1)

From Github documentation:

You must store workflows in the .github/workflows directory in the root of your repository.

(workflow file cannot be inside subdirectory, it must be in .github/workflows)

You can define a naming convention for your repository: To keep the same logic that you were trying to implement with the subfolders, you could have .github/workflows/economic_growth.back-end.yaml and .github/workflows/economic_growth.front-end.yaml.

Can I run a docker container with GitHub actions? (Votes: 1)

Asked by Bruno Brant.

Question

I want to perform CI with tests that depend on a container that is published on Docker Hub. Is it possible? Can I start the container and run tests that depend on it?

Answers

By scthi (Votes: 4)

Yes, here is an example using docker-compose:

  test:
    name: Test
    runs-on: ubuntu-latest
    steps:
      - name: Check out code
        uses: actions/checkout@v1

      - name: Login to Docker Registry
        run: docker login "$DOCKER_REGISTRY" -u "$DOCKER_USERNAME" --password-stdin <<< "$DOCKER_PASSWORD"
        env:
          DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
          DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}
          DOCKER_REGISTRY: ${{ secrets.DOCKER_REGISTRY }}
      - name: Start env
        run: docker-compose up
      - name: Run tests
        run: ...

@neverendingqs
Copy link
Owner Author

It's fixed! So sorry about that.

@Vadorequest
Copy link

/so How to debug Github Actions?

@github-actions
Copy link
Contributor

Results for How to debug Github Actions?

Unexpected github action trigger (Votes: 1)

Asked by Nathan Atkins.

Question

This github action runs on a pull request to develop. This isn't what I expected and I could use some fishing tips on how to understand/debug why it is triggered.

on:
  push:
    branches:
    - master
  pull_request:
    branches:
    - master

develop is the default branch for the repository. I mention this because this action isn't triggered when I push to a feature branch. It is triggered on a pull request to master.

Answers

anaconda subprocess.CalledProcessError: Command returned non-zero exit status 1 (Votes: 1)

Asked by user2426998.

Question

I am using GitHub Actions for uploading to an Anaconda channel. It gives an error that I can't debug. Every component is working okay, but I can't find any similar error. Could you provide a hint on how to debug this error?

This is meta.yaml

{% set version = "0.1.1rc42" %}

package:
  name: blablablabla
  version: {{ version }}

source:
  url: https://files.pythonhosted.org/packages/source/e/blablablabla/blablablabla-0.1.1rc42.tar.gz

build:
  noarch: python
  number: 0
  script: python -m pip install --no-deps --ignore-installed .

requirements:
  host:
    - python
    - pip
    - numpy==1.16.5
    - cython
  run:
    - python

test:
  imports:
    - blablablabla

about:
  home: https://github.com/blablablabla-author/blablablabla
  license: GPLv3
  summary: 'blablablabla'
  description: |
    blablablabla 
  dev_url: https://github.com/bowman-lab/blablablabla
  doc_url: https://pypi.org/project/blablablabla/
  doc_source_url: https://blablablabla.readthedocs.io/en/latest/index.html

This is the code that uploads to anaconda.

    conda config --set anaconda_upload yes
    apt-get update
    apt-get install -y build-essential
    conda create -n myenv python=3.6
    activate myenv
    conda install --yes pip
    conda install --yes numpy cython
    conda install --yes -c conda-forge nose mdtraj  
    anaconda login --username $INPUT_ANACONDAUSERNAME --password $INPUT_ANACONDAPASSWORD
    echo $PWD
    conda build . 
    anaconda logout

This is the error that I get.

020-02-28T00:50:40.4600708Z INFO:conda_build.source:Downloading source to cache: blablablabla-0.1.1rc40.tar.gz
2020-02-28T00:50:40.4601418Z INFO conda_build.source:download_to_cache(66): Downloading source to cache: blablablabla-0.1.1rc40.tar.gz
2020-02-28T00:50:40.4602139Z INFO:conda_build.source:Downloading https://files.pythonhosted.org/packages/source/e/blablablabla/blablablabla-0.1.1rc40.tar.gz
2020-02-28T00:50:40.4602871Z INFO conda_build.source:download_to_cache(80): Downloading https://files.pythonhosted.org/packages/source/e/blablablabla/blablablabla-0.1.1rc40.tar.gz
2020-02-28T00:50:40.4603721Z Downloading source to cache: blablablabla-0.1.1rc40.tar.gz
2020-02-28T00:50:40.4604375Z Downloading https://files.pythonhosted.org/packages/source/e/blablablabla/blablablabla-0.1.1rc40.tar.gz
2020-02-28T00:50:40.6318549Z INFO:conda_build.source:Success
2020-02-28T00:50:40.6318756Z INFO conda_build.source:download_to_cache(91): Success
2020-02-28T00:50:40.6321187Z Success
2020-02-28T00:50:41.9181877Z     ERROR: Command errored out with exit status 1:
2020-02-28T00:50:41.9184004Z      command: /opt/conda/conda-bld/blablablabla_1582851022543/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_place/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-req-build-dljojnij/setup.py'"'"'; __file__='"'"'/tmp/pip-req-build-dljojnij/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-req-build-dljojnij/pip-egg-info
2020-02-28T00:50:41.9184761Z          cwd: /tmp/pip-req-build-dljojnij/
2020-02-28T00:50:41.9185124Z     Complete output (11 lines):
2020-02-28T00:50:41.9185456Z     Traceback (most recent call last):
2020-02-28T00:50:41.9187027Z       File "<string>", line 1, in <module>
2020-02-28T00:50:41.9187934Z       File "/tmp/pip-req-build-dljojnij/setup.py", line 106, in <module>
2020-02-28T00:50:41.9188330Z         ext_modules=cythonize(cython_extensions),
2020-02-28T00:50:41.9189305Z       File "/opt/conda/conda-bld/blablablabla_1582851022543/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_place/lib/python3.7/site-packages/Cython/Build/Dependencies.py", line 971, in cythonize
2020-02-28T00:50:41.9189776Z         aliases=aliases)
2020-02-28T00:50:41.9190729Z       File "/opt/conda/conda-bld/blablablabla_1582851022543/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_place/lib/python3.7/site-packages/Cython/Build/Dependencies.py", line 815, in create_extension_list
2020-02-28T00:50:41.9191479Z         for file in nonempty(sorted(extended_iglob(filepattern)), "'%s' doesn't match any files" % filepattern):
2020-02-28T00:50:41.9192445Z       File "/opt/conda/conda-bld/blablablabla_1582851022543/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_place/lib/python3.7/site-packages/Cython/Build/Dependencies.py", line 114, in nonempty
2020-02-28T00:50:41.9192927Z         raise ValueError(error_msg)
2020-02-28T00:50:41.9193461Z     ValueError: 'blablablabla/info_theory/libinfo.pyx' doesn't match any files
2020-02-28T00:50:41.9193987Z     ----------------------------------------
2020-02-28T00:50:41.9221151Z ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
2020-02-28T00:50:42.7756204Z Extracting download
2020-02-28T00:50:42.7757038Z source tree in: /opt/conda/conda-bld/blablablabla_1582851022543/work
2020-02-28T00:50:42.7757877Z export PREFIX=/opt/conda/conda-bld/blablablabla_1582851022543/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_place
2020-02-28T00:50:42.7758325Z export BUILD_PREFIX=/opt/conda/conda-bld/blablablabla_1582851022543/_build_env
2020-02-28T00:50:42.7758673Z export SRC_DIR=/opt/conda/conda-bld/blablablabla_1582851022543/work
2020-02-28T00:50:42.7758827Z Processing $SRC_DIR
2020-02-28T00:50:42.7777905Z Traceback (most recent call last):
2020-02-28T00:50:42.7778886Z   File "/opt/conda/bin/conda-build", line 11, in <module>
2020-02-28T00:50:42.7779085Z     sys.exit(main())
2020-02-28T00:50:42.7779665Z   File "/opt/conda/lib/python3.7/site-packages/conda_build/cli/main_build.py", line 469, in main
2020-02-28T00:50:42.7779865Z     execute(sys.argv[1:])
2020-02-28T00:50:42.7780417Z   File "/opt/conda/lib/python3.7/site-packages/conda_build/cli/main_build.py", line 460, in execute
2020-02-28T00:50:42.7780871Z     verify=args.verify, variants=args.variants)
2020-02-28T00:50:42.7781423Z   File "/opt/conda/lib/python3.7/site-packages/conda_build/api.py", line 209, in build
2020-02-28T00:50:42.7781632Z     notest=notest, need_source_download=need_source_download, variants=variants)
2020-02-28T00:50:42.7787528Z   File "/opt/conda/lib/python3.7/site-packages/conda_build/build.py", line 2344, in build_tree
2020-02-28T00:50:42.7787785Z     notest=notest,
2020-02-28T00:50:42.7788445Z   File "/opt/conda/lib/python3.7/site-packages/conda_build/build.py", line 1492, in build
2020-02-28T00:50:42.7793073Z     cwd=src_dir, stats=build_stats)
2020-02-28T00:50:42.7793801Z   File "/opt/conda/lib/python3.7/site-packages/conda_build/utils.py", line 398, in check_call_env
2020-02-28T00:50:42.7794226Z     return _func_defaulting_env_to_os_environ('call', *popenargs, **kwargs)
2020-02-28T00:50:42.7794783Z   File "/opt/conda/lib/python3.7/site-packages/conda_build/utils.py", line 378, in _func_defaulting_env_to_os_environ
2020-02-28T00:50:42.7795034Z     raise subprocess.CalledProcessError(proc.returncode, _args)
2020-02-28T00:50:42.7795506Z subprocess.CalledProcessError: Command '['/bin/bash', '-o', 'errexit', '/opt/conda/conda-bld/blablablabla_1582851022543/work/conda_build.sh']' returned non-zero exit status 1.
Answers

Can't create (python) QApplication in github action (Votes: 1)

Asked by konserw.

Question

I've got some unit tests for my (Python) Qt GUI which require a QApplication instance, but creating one always fails for me (i.e. it ends in a core dump and application abort at the line with QApplication()). What I've tried so far is:

  • creation methods:
    • plain app = QApplication() on module level
    • app = QApplication(['--platform offscreen'])
    • using fixture from pytest-qt that manages QApplication object creation, i.e. passing qtbot to my tests
  • I've even tried both python ports of qt, i.e.:
    • PyQt5
    • PySide2
  • Virtual screens:

I've tried using https://github.com/nektos/act to debug this issue locally, but with this approach the issue was not reproducible (i.e. everything worked as expected) until I added herbstluftwm; i.e. the only thing I was able to achieve was that it started to fail locally as well.

What else can I check? Have you seen a QApplication created successfully on GitHub Actions? BTW, how do I get Qt's output to be visible in GitHub Actions? (I've added env: QT_DEBUG_PLUGINS: 1 and still can't see any errors.)

Answers

By konserw (Votes: 1)

Thanks to `@eyllanesc`'s request for an MRE I've created this minimal example repo, https://github.com/konserw/mre, which allowed me to find the solution on my own. It turns out that you need to install xvfb and libxkbcommon-x11-0, but you must NOT run the xvfb service or herbstluftwm. Then you need to run your test command (coverage in my case) using xvfb-run, which in the case of GitHub Actions requires an absolute path to coverage, like this:

xvfb-run `which coverage` run -m pytest

I hope this helps future users of GitHub Actions struggling to get PyQt5 or PySide2 GUI tests running.

BTW, pytest was silencing output from Qt's QT_DEBUG_PLUGINS, so replacing the test command with a plain python call to a minimal script that reproduces the problem was key here. See https://github.com/konserw/mre/runs/509156615?check_suite_focus=true
