Channel: MarsHut

Indexing a collection with many fields

Hi

I am using MongoDB to store proxy logs.

NoSQL is an ideal fit for proxy logs, as I will end up with millions of
relatively flat records with no relationships, coming from many locations
(using tag-aware sharding).

I noticed that querying the database is very slow once there are in excess
of a million records even though I had some "key" indexes in place.

I assumed that if I indexed some fields that would be mandatory in every
query, then MongoDB would use that index to minimize the amount of data it
had to search through.

For example, I could create a relatively small index on timestamp and IP
address that would be used in every query, and this should vastly reduce
the number of records MongoDB has to search through.
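For reference, that compound index is the kind of thing I mean (a sketch from the mongo shell; the collection and field names `proxylogs`, `timestamp`, and `ip` are assumptions, so substitute your actual schema — older shells use `ensureIndex` instead of `createIndex`):

```javascript
// Compound index on the two fields present in every query.
db.proxylogs.createIndex({ timestamp: 1, ip: 1 })

// Check whether the index is actually chosen for a typical query:
db.proxylogs.find({
    timestamp: { $gte: ISODate("2014-01-01") },
    ip: "10.0.0.1"
}).explain("executionStats")
```

The `explain()` output shows the winning plan, so you can see whether a query is doing an index scan or a full collection scan.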

I noticed that this didn't seem to be working.

Then I read that "a covered query is a query in which ... all the fields
returned in the results are in the same index".
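To illustrate the case that quote describes (again a sketch with the same hypothetical field names):

```javascript
// With an index on { timestamp: 1, ip: 1 }, this query is "covered":
// both the filter fields and the projected fields live in the index,
// so the documents themselves never need to be fetched.
db.proxylogs.find(
    { timestamp: { $gte: ISODate("2014-01-01") } },
    { timestamp: 1, ip: 1, _id: 0 }  // projection limited to indexed fields
)
```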

Does this mean that an index cannot be used to reduce the amount of data
MongoDB has to sift through, and that it's either all or nothing?

There are so many permutations of how a user might search my database, and
with around 30-40 fields in the collection I cannot index them all.

Am I missing something here?

Thanks

Daniel

