
Fixed a major bug when reusing an operand #126

Merged

chewxy merged 4 commits from fixNonCommutative into master on Apr 4, 2022
Conversation

@chewxy (Member) commented on Apr 4, 2022

The bug is best exemplified by the following:

```go
package main

import (
	"log"

	"gorgonia.org/tensor"
)

func main() {
	a := tensor.New(tensor.WithShape(2, 2), tensor.WithBacking([]float64{1, 2, 3, 4}))
	b := tensor.New(tensor.WithShape(2, 2), tensor.WithBacking([]float64{2, 2, 3, 3}))
	// b is both an operand and the reuse (destination) tensor.
	_, err := tensor.Div(a, b, tensor.WithReuse(b))
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("a %v", a)
	log.Printf("b %v", b)
}
```

When an operand is also passed in as the reuse tensor, the operation produces incorrect results. This has now been fixed.
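To make the aliasing hazard concrete, here is a minimal sketch with plain slices (not the actual `tensor` internals; the function names and the buggy strategy are illustrative assumptions). An in-place implementation that first copies the left operand into the reuse buffer clobbers the right operand whenever the reuse buffer aliases it, so a non-commutative operation like division goes wrong:

```go
package main

import "fmt"

// buggyDivInto sketches a naive in-place element-wise division a/b:
// it copies a into reuse, then divides in place. When reuse aliases b,
// the copy overwrites b before it is read, so every quotient is a/a = 1.
func buggyDivInto(a, b, reuse []float64) {
	copy(reuse, a) // clobbers b when reuse == b
	for i := range reuse {
		reuse[i] /= b[i]
	}
}

// safeDivInto reads both operands before writing each element,
// so aliasing the reuse buffer with an operand is harmless.
func safeDivInto(a, b, reuse []float64) {
	for i := range reuse {
		reuse[i] = a[i] / b[i]
	}
}

func main() {
	a := []float64{1, 2, 3, 4}

	b := []float64{2, 2, 3, 3}
	buggyDivInto(a, b, b) // reuse aliases the divisor
	fmt.Println(b)        // [1 1 1 1] — wrong

	b = []float64{2, 2, 3, 3}
	safeDivInto(a, b, b)
	fmt.Println(b) // [0.5 1 1 1.3333333333333333] — correct a/b
}
```

The fix amounts to choosing an evaluation strategy that never destroys an operand before it has been fully consumed, which matters only for non-commutative operations (hence the branch name `fixNonCommutative`).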

@coveralls

Coverage Status

Coverage decreased (-0.4%) to 21.541% when pulling 3fd5e14 on fixNonCommutative into 763642a on master.

@coveralls commented on Apr 4, 2022

Coverage Status

Coverage decreased (-0.4%) to 21.512% when pulling f808797 on fixNonCommutative into 763642a on master.

@chewxy chewxy merged commit 9a56298 into master Apr 4, 2022