Project Name | Stars | Downloads | Last Commit | Total Releases | Latest Release | Open Issues | License | Language | Description
---|---|---|---|---|---|---|---|---|---
Elastic | 7,207 | 559 | 24 days ago | 309 | March 19, 2022 | 107 | mit | Go | Deprecated: use the official Elasticsearch client for Go at https://github.com/elastic/go-elasticsearch
Raigad | 341 | 1 | 2 years ago | 3 | June 18, 2015 | 8 | apache-2.0 | Java | Co-process for backup/recovery, auto deployments, and centralized configuration management for Elasticsearch
Elasticsearch Logstash Index Mgmt | 274 | | 7 years ago | | | 1 | mit | Shell | Bash scripts for managing backup, delete, and restore of Elasticsearch indexes created by Logstash
Elasticsearch Cloud Azure | 75 | | 7 years ago | 17 | April 15, 2016 | 2 | | | Microsoft Azure discovery and snapshot/restore plugin for Elasticsearch
Twly Voter Guide | 66 | | 5 years ago | | | 14 | | HTML | Legislator voting guide (Taiwan)
Elasticsearch S3 Backup | 29 | | a month ago | | | 2 | mit | Python | Backup and restore Elasticsearch snapshots to/from AWS S3
Es_dump_restore | 26 | | 6 years ago | 20 | November 13, 2017 | 3 | mit | Ruby | Saves and restores Elasticsearch index contents using a compressed file format
Elasticsearch Snapshots | 22 | | 5 years ago | | | 1 | mit | Python | Set of scripts to manage Elasticsearch snapshots in S3
Elasticsearch Logstash S3 Backup | 18 | | 7 months ago | | | 2 | mit | Shell | Scripts to backup and restore Logstash indexes from Elasticsearch; can be run as a cron on Aptible
Kibana4 Backup | 17 | | 4 years ago | 13 | April 25, 2017 | 5 | mit | JavaScript |
Elastic is an Elasticsearch client for the Go programming language.
See the wiki for additional information about Elastic.
Notice that the master branch always refers to the latest version of Elastic. If you want to use stable versions of Elastic, you should use the packages released via gopkg.in.
Here's the version matrix:
Elasticsearch version | Elastic version | Package URL
---|---|---
2.x | 3.0 | gopkg.in/olivere/elastic.v3 (source doc)
1.x | 2.0 | gopkg.in/olivere/elastic.v2 (source doc)
0.9-1.3 | 1.0 | gopkg.in/olivere/elastic.v1 (source doc)
Example:
You have Elasticsearch 1.7.3 installed and want to use Elastic. As listed above, you should use Elastic 2.0. So you first install Elastic 2.0.
$ go get gopkg.in/olivere/elastic.v2
Then you use it via the following import path:
import "gopkg.in/olivere/elastic.v2"
Elastic 3.0 targets Elasticsearch 2.0 and later. Elasticsearch 2.0.0 was released on 28th October 2015.
Notice that there are a lot of breaking changes in Elasticsearch 2.0 and we used this as an opportunity to clean up and refactor Elastic as well.
Elastic 2.0 targets Elasticsearch 1.x and is published via gopkg.in/olivere/elastic.v2.
Elastic 1.0 is deprecated. You should really update Elasticsearch and Elastic to a recent version.
However, if you cannot update for some reason, don't worry. Version 1.0 is still available. All you need to do is go-get it and change your import path as described above.
We have used Elastic in production since 2012. Although Elastic is quite stable in our experience, we don't have a stable API yet. The reason for this is that Elasticsearch changes quite often and at a fast pace. At this moment we focus on features, not on a stable API.
Having said that, there have been no big API changes that required you to rewrite your application wholesale. More often than not it's a matter of renaming APIs and adding or removing features so that we stay in sync with the Elasticsearch API.
Elastic has been used in production with the following Elasticsearch versions: 0.90, 1.0-1.7. Furthermore, we use Travis CI to test Elastic with the most recent versions of Elasticsearch and Go. See the .travis.yml file for the exact matrix and Travis for the results.
Elasticsearch has quite a few features. A lot of them are not yet implemented in Elastic (see below for details). I add features and APIs as required. It's straightforward to implement missing pieces. I'm accepting pull requests :-)
Having said that, I hope you find the project useful.
The first thing you do is to create a Client. The client connects to Elasticsearch on http://127.0.0.1:9200 by default.
You typically create one client for your app. Here's a complete example.
```go
// Create a client
client, err := elastic.NewClient()
if err != nil {
	// Handle error
}

// Create an index
_, err = client.CreateIndex("twitter").Do()
if err != nil {
	// Handle error
	panic(err)
}

// Add a document to the index
tweet := Tweet{User: "olivere", Message: "Take Five"}
_, err = client.Index().
	Index("twitter").
	Type("tweet").
	Id("1").
	BodyJson(tweet).
	Do()
if err != nil {
	// Handle error
	panic(err)
}

// Search with a term query
termQuery := elastic.NewTermQuery("user", "olivere")
searchResult, err := client.Search().
	Index("twitter").   // search in index "twitter"
	Query(&termQuery).  // specify the query
	Sort("user", true). // sort by "user" field, ascending
	From(0).Size(10).   // take documents 0-9
	Pretty(true).       // pretty print request and response JSON
	Do()                // execute
if err != nil {
	// Handle error
	panic(err)
}

// searchResult is of type SearchResult and returns hits, suggestions,
// and all kinds of other information from Elasticsearch.
fmt.Printf("Query took %d milliseconds\n", searchResult.TookInMillis)

// Each is a convenience function that iterates over hits in a search result.
// It makes sure you don't need to check for nil values in the response.
// However, it ignores errors in serialization. If you want full control
// over iterating the hits, see below.
var ttyp Tweet
for _, item := range searchResult.Each(reflect.TypeOf(ttyp)) {
	if t, ok := item.(Tweet); ok {
		fmt.Printf("Tweet by %s: %s\n", t.User, t.Message)
	}
}

// TotalHits is another convenience function that works even when something goes wrong.
fmt.Printf("Found a total of %d tweets\n", searchResult.TotalHits())

// Here's how you iterate through results with full control over each step.
if searchResult.Hits != nil {
	fmt.Printf("Found a total of %d tweets\n", searchResult.Hits.TotalHits)

	// Iterate through results
	for _, hit := range searchResult.Hits.Hits {
		// hit.Index contains the name of the index

		// Deserialize hit.Source into a Tweet (could also be just a map[string]interface{}).
		var t Tweet
		err := json.Unmarshal(*hit.Source, &t)
		if err != nil {
			// Deserialization failed
		}

		// Work with tweet
		fmt.Printf("Tweet by %s: %s\n", t.User, t.Message)
	}
} else {
	// No hits
	fmt.Print("Found no tweets\n")
}

// Delete the index again
_, err = client.DeleteIndex("twitter").Do()
if err != nil {
	// Handle error
	panic(err)
}
```
See the wiki for more details.
Here's the current API status.
The cat APIs are not implemented. Those are better suited for operating with Elasticsearch on the command line.
match
multi_match
bool
boosting
common_terms
constant_score
dis_max
filtered
fuzzy_like_this_query (flt)
fuzzy_like_this_field_query (flt_field)
function_score
fuzzy
geo_shape
has_child
has_parent
ids
indices
match_all
mlt
mlt_field
nested
prefix
query_string
simple_query_string
range
regexp
span_first
span_multi_term
span_near
span_not
span_or
span_term
term
terms
top_children
wildcard
minimum_should_match
multi_term_query_rewrite
template_query
and
bool
exists
geo_bounding_box
geo_distance
geo_distance_range
geo_polygon
geoshape
geohash
has_child
has_parent
ids
indices
limit
match_all
missing
nested
not
or
prefix
query
range
regexp
script
term
terms
type
Scrolling through documents (e.g. search_type=scan) is implemented via the Scroll and Scan services. The ClearScroll API is implemented as well.
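Scrolling is cursor-style pagination: each call returns a batch of hits plus a scroll ID to pass into the next call, until a batch comes back empty. The loop can be sketched with a stubbed fetch function (the `fetch` helper and its canned pages are invented for illustration; they are not Elastic's API):

```go
package main

import "fmt"

// page is a stand-in for one scroll response: a batch of hits
// plus the cursor (scroll ID) to use for the next request.
type page struct {
	hits     []string
	scrollID string
}

// fetch fakes the Scroll service with canned pages; a real client
// would issue an HTTP request carrying the scroll ID.
func fetch(scrollID string) page {
	pages := map[string]page{
		"":   {hits: []string{"doc1", "doc2"}, scrollID: "s1"},
		"s1": {hits: []string{"doc3"}, scrollID: "s2"},
		"s2": {hits: nil, scrollID: "s2"}, // empty batch ends the scroll
	}
	return pages[scrollID]
}

func main() {
	var all []string
	scrollID := "" // the first request has no scroll ID yet
	for {
		p := fetch(scrollID)
		if len(p.hits) == 0 {
			break // no more results
		}
		all = append(all, p.hits...)
		scrollID = p.scrollID // carry the cursor forward
	}
	fmt.Println(all) // every document, collected batch by batch
}
```

The same loop shape applies whether the batches come from Scan (unsorted, efficient bulk export) or from an ordinary sorted Scroll.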
Read the contribution guidelines.
Thanks a lot for the great folks working hard on Elasticsearch and Go.
MIT license. See the LICENSE file provided in the repository for details.