01:37:40 GMT hello, I'm new to redis. I'm trying to make a sort of blog with categories and posts. What do you think about this? http://pastebin.com/rujxwt4S and how would you do it?
01:43:15 GMT gangstacat: please don't private message random people on IRC, it is considered bad behaviour
01:43:35 GMT gangstacat: can you please use zerobin.net instead? I don't click on pastebin.com links
01:44:41 GMT sorry kulelu88, https://www.zerobin.net/?e0e7a175c9f297e1#htZq9i3hxD6kT0eZW55Y13nycuuq/NqPyEueJfi2UWk=
01:46:17 GMT gangstacat: how many key-values does your data structure have?
01:47:36 GMT not many
01:47:52 GMT like 4-5? I use hashes mostly
01:48:30 GMT without categories it would be easy, but they pose a problem because there is no foreign key or WHERE clause
01:48:40 GMT like in SQL
01:49:18 GMT if I make a hash for each post with a "category" field holding the category's id, I can't list all the posts in that category
01:50:08 GMT gangstacat: are you attempting to use redis to cache your regular DB?
01:50:17 GMT to replace the regular DB
01:50:53 GMT why? (I am asking these questions because perhaps redis is not the right solution to replace something like this)
01:51:19 GMT I just want to try something new and fast
01:51:51 GMT I recommend you don't rewrite your database to use Redis and instead use it as a caching layer
01:52:25 GMT I'm not rewriting anything; this is a little project to learn redis. I just want to know how I can model the data to make a blog with categories, if that's possible
01:53:13 GMT gangstacat: what language are you using?
01:53:37 GMT Nim, but that's not the point, I'm mostly playing with redis-cli for the moment
01:53:53 GMT HSET categories [id] [name of the category] - this creates a hash
01:54:09 GMT You need to use SET for key-values
01:54:26 GMT I tried to replicate the logic of http://redis.io/topics/twitter-clone
01:54:41 GMT "HSET users antirez 1000"
01:55:04 GMT where they create a hash for users with username and id
01:55:15 GMT like an index, if you will
01:56:06 GMT and the categories hash is just for storing the human-readable category name
01:56:10 GMT okay, so you want to replicate the code from that twitter clone. Show me your current Nim code. I can read it (somewhat) and assist
01:56:24 GMT I don't have any yet
01:56:49 GMT the thing is, I'm trying to find the right model before implementing
01:57:09 GMT so you want: Users, Categories and Posts?
01:57:31 GMT no users, keep it simple for now
01:57:54 GMT Posts and Categories?
01:57:54 GMT just posts that belong to a category, each category having a dedicated page listing all of its posts
01:57:56 GMT yes
01:58:40 GMT I wanted to know if my logic is sound, and how you would do it otherwise
01:59:34 GMT Well, I would do it like this: each post could be a key-value. Each category would be a list containing each post. That way the posts are already ordered within the category (that is how lists work)
02:00:09 GMT how do you store the post's title?
02:01:09 GMT I recommend you take a look here for that: http://redis.io/topics/data-types-intro
02:01:09 GMT but for the categories, yes, I did it that way, with ZADD timestamp post_id
02:03:36 GMT ok.. thanks
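[Editor's note: a minimal redis-cli sketch of the model discussed above, roughly following the twitter-clone layout: one hash per post, a categories hash as a name index, and a sorted set per category scored by timestamp. The key names (post:<id>, category:<id>:posts) and the sample values are illustrative, not taken from the conversation.]

    # allocate a post id
    INCR next_post_id                       # -> 1
    # one hash per post (multi-field HSET needs Redis >= 4.0; use HMSET on older versions)
    HSET post:1 title "Hello Redis" body "first post" category 1 created 1455500000
    # human-readable category names, as mentioned in the conversation
    HSET categories 1 "redis"
    # per-category index, scored by timestamp so posts stay in chronological order
    ZADD category:1:posts 1455500000 1
    # list the 10 newest post ids of category 1, then fetch a post's fields
    ZREVRANGE category:1:posts 0 9
    HGETALL post:1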
02:50:31 GMT Hey all - I have a machine with 64G of RAM, and I have two flat files of small strings (all between 6 and 15 characters long) that are line separated. The total file size for both is 25.2G on disk - does anyone have a rough estimate of the minimum amount of RAM this would cost in redis, and what would be the most lightweight way to store them?
08:15:28 GMT dgaff: compressing them and putting them in one key would probably be the cheapest way. Joking aside, it totally depends on how you access them; the most efficient format is storing them in a hash (or multiple hashes, aka buckets) afaik
08:17:33 GMT dgaff: http://redis.io/topics/memory-optimization
08:18:32 GMT dgaff: one thing is sure though: it's definitely going to take more memory than the text files do on disk, because a) you additionally need key names and b) redis is not compressed in memory, only its dumps are
15:38:54 GMT minus: Thanks for the tips - I let MSETs run overnight to push the ids into Redis... and got to 16% before the machine started thrashing
15:39:05 GMT The page you pointed to, though, looks hopeful
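[Editor's note: the memory-optimization page linked above largely comes down to packing many small values into hashes ("buckets") so Redis keeps them in its compact ziplist/listpack encoding, instead of spending a top-level key per string. A rough redis-cli sketch, assuming each string gets a numeric id and a bucket size of 1000; the bucket:<n> naming and the limits are illustrative, not from the conversation.]

    # keep small hashes in the compact encoding (exact names and defaults vary by
    # Redis version; newer releases call these hash-max-listpack-*)
    CONFIG SET hash-max-ziplist-entries 1024
    CONFIG SET hash-max-ziplist-value 64
    # instead of one key per string, e.g.:  SET 1234567 "shortstring"
    # bucket by id: key = id / 1000, field = id % 1000
    HSET bucket:1234 567 "shortstring"
    HGET bucket:1234 567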