How to handle "sonic connection is closed" error at runtime? #14

Closed
fahmifan opened this issue Jan 19, 2020 · 6 comments

Comments

@fahmifan

fahmifan commented Jan 19, 2020

When the app is running, how do I properly handle the "sonic connection is closed" error? Can a reconnect/retry be implemented?

@alexisvisco
Member

Hello, thank you for your issue!

Can you provide an example of the problem, please?

Have a nice day.

@fahmifan
Author

fahmifan commented Feb 7, 2020

@alexisvisco it occurs at random times; I just get an error like a broken pipe on the connection. I don't really know what the cause is, but I made a workaround by creating a separate goroutine that pings Sonic. This is the wrapper I wrote:

package sosonic

import (
	"sync"
	"time"

	"github.com/expectedsh/go-sonic/sonic"
	log "github.com/sirupsen/logrus"
)

// SoSonic wraps the go-sonic ingester and searcher and reconnects
// them in the background when the connection drops.
type SoSonic struct {
	host     string
	port     int
	secret   string
	mu       sync.Mutex // guards Ingester and Searcher while they are swapped
	Ingester sonic.Ingestable
	Searcher sonic.Searchable
}

// New connects to Sonic and starts a background goroutine that pings
// both connections and reconnects them when a ping fails.
func New(host string, port int, secret string) *SoSonic {
	ingester, err := sonic.NewIngester(host, port, secret)
	if err != nil {
		log.Error(err)
	}

	searcher, err := sonic.NewSearch(host, port, secret)
	if err != nil {
		log.Error(err)
	}

	ss := &SoSonic{
		Ingester: ingester,
		Searcher: searcher,
		host:     host,
		port:     port,
		secret:   secret,
	}

	go ss.check()

	return ss
}

func (s *SoSonic) check() {
	for {
		// The mutex must be a struct field shared with callers; a
		// mutex created inside this loop would be a new, unshared
		// lock on every iteration and would protect nothing.
		s.mu.Lock()
		s.trySearcher()
		s.tryIngester()
		s.mu.Unlock()

		time.Sleep(10 * time.Second)
	}
}

func (s *SoSonic) tryIngester() {
	if err := s.Ingester.Ping(); err == nil {
		return
	}

	log.Info("trying to reconnect ingester")
	ingester, err := sonic.NewIngester(s.host, s.port, s.secret)
	if err != nil {
		log.Error(err)
		return
	}

	s.Ingester = ingester
	log.Info("ingester reconnected")
}

func (s *SoSonic) trySearcher() {
	if err := s.Searcher.Ping(); err == nil {
		return
	}

	log.Info("trying to reconnect searcher")
	searcher, err := sonic.NewSearch(s.host, s.port, s.secret)
	if err != nil {
		log.Error(err)
		return
	}

	s.Searcher = searcher
	log.Info("searcher reconnected")
}

With this wrapper it will try to reconnect to Sonic if the connection breaks mid-run.

@tanqhnguyen

tanqhnguyen commented Jul 30, 2020

In case anyone stumbles upon this same problem: it might be caused by the tcp_timeout configuration, which defaults to 300 seconds.

There are a few ways I can think of to solve this:

  • Set tcp_timeout to a bigger number (not sure if there is a "no timeout" value, need to double-check this)
  • Implement a "connection pool" that regularly pings the connections to detect and remove dead ones
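For the first option, the timeout lives in Sonic's config.cfg under the [channel] section. A sketch, assuming the key names from the default config shipped with Sonic (double-check against your version):

```toml
[channel]
inet = "[::1]:1491"
# Idle TCP connections are closed after this many seconds (default 300).
# Raise it if clients legitimately sit idle for longer.
tcp_timeout = 300
```

Even with a higher timeout, idle connections will still be dropped eventually, so client-side ping/reconnect logic remains useful.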

@alexisvisco
Member

Thanks @tanqhnguyen for the reply. I am quite busy at the moment, but when I am available again I will implement #11 with ping checks in the connection pool in mind.

@tanqhnguyen

@alexisvisco that's ok :) I happen to need this at the moment, so I might as well try to implement the connection pool for it. I will try to put together a POC and we can discuss further. Of course, if you happen to have anything WIP, we can continue from there.
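A ping-based pool could be sketched roughly like this. Conn, Pool, fakeConn and demo are hypothetical names, not part of go-sonic; the only assumption about the real clients is that they expose Ping() error, as the wrapper above relies on:

```go
package main

import (
	"errors"
	"fmt"
	"sync"
)

// Conn is the minimal behaviour a pooled connection needs.
type Conn interface {
	Ping() error
}

// Pool holds idle connections and replaces dead ones on checkout.
type Pool struct {
	mu   sync.Mutex
	dial func() (Conn, error) // how to open a fresh connection
	idle []Conn
	max  int // maximum idle connections to retain
}

func NewPool(max int, dial func() (Conn, error)) *Pool {
	return &Pool{dial: dial, max: max}
}

// Get pings idle connections, discards any that fail, and dials a
// fresh connection only when no healthy idle one remains.
func (p *Pool) Get() (Conn, error) {
	p.mu.Lock()
	defer p.mu.Unlock()
	for n := len(p.idle); n > 0; n = len(p.idle) {
		c := p.idle[n-1]
		p.idle = p.idle[:n-1]
		if c.Ping() == nil {
			return c, nil // healthy idle connection
		}
		// dead connection (e.g. closed by tcp_timeout): drop it
	}
	return p.dial()
}

// Put returns a connection to the idle list, discarding it when full.
func (p *Pool) Put(c Conn) {
	p.mu.Lock()
	defer p.mu.Unlock()
	if len(p.idle) < p.max {
		p.idle = append(p.idle, c)
	}
}

// fakeConn lets the example run without a Sonic server.
type fakeConn struct {
	id   int
	dead bool
}

func (f *fakeConn) Ping() error {
	if f.dead {
		return errors.New("sonic connection is closed")
	}
	return nil
}

// demo returns how many dials happened and the id of the connection
// handed out after the first one went dead.
func demo() (dials, freshID int) {
	n := 0
	pool := NewPool(2, func() (Conn, error) {
		n++
		return &fakeConn{id: n}, nil
	})

	c, _ := pool.Get() // no idle connections yet: dials #1
	pool.Put(c)

	c.(*fakeConn).dead = true // simulate tcp_timeout closing it
	c2, _ := pool.Get()       // dead idle conn is discarded, dials #2
	return n, c2.(*fakeConn).id
}

func main() {
	fmt.Println(demo()) // prints: 2 2
}
```

The key point is that the ping happens on checkout, so callers never receive a connection that is already known to be dead.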

@fahmifan
Author

I think this issue can be closed then. I would like to see the connection pool implementation.
Many thanks for the help, guys.
