Using Python I managed to create for myself a kind of dictionary of terms and their meanings, and it is rather large - x00,000 entries (I can't estimate precisely at this time because they are split across multiple files, one per starting letter).
The files are pickled dictionary objects with this structure:

dict{word: (attribute,
            kind,
            [meanings],
            [examples],
            [connections]
            )
    }

In case it matters: it's a Python dictionary object, with the key being a string and the value a tuple, and the tuple in turn contains string or list objects.
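To make the structure concrete, one entry of such a dictionary could look like this (the word and its fields are made up for illustration):

```python
# One entry of the described structure: the key is the word (a string),
# the value a tuple of (attribute, kind, [meanings], [examples], [connections]).
lexicon = {
    'run': ('common',                 # attribute
            'verb',                   # kind
            ['to move quickly'],      # meanings
            ['she runs every day'],   # examples
            ['walk', 'sprint'])       # connections
}

attribute, kind, meanings, examples, connections = lexicon['run']
```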

Now I intend to put them all into an sqlite3 database, since that is easy with Python. Before I actually do it, I thought I'd ask for advice on whether sqlite3 is a sensible choice, as I have never done any real database work before.

I understand the answer depends on what I want to do with this data (besides its structure), but let's say I'd like it stored locally in one place (a file), reasonably easy to access (query), and perhaps to transform.

Yes, I have used sqlite3 for this kind of thing. The dictionary values had to be pickled first, though:

import sqlite3
import pickle
from collections.abc import MutableMapping   # collections.MutableMapping was removed in Python 3.10

class DBDict(MutableMapping):
    'Database driven dictlike object (with non-persistent in-memory option).'

    def __init__(self, db_filename=':memory:', **kwds):
        self.db = sqlite3.connect(db_filename)
        self.db.text_factory = str
        try:
self.db.execute('CREATE TABLE dict (key text PRIMARY KEY, value text)')
            # the PRIMARY KEY constraint already creates an index on "key"
            self.db.commit()
        except sqlite3.OperationalError:
            pass                # DB already exists
        self.update(kwds)

    def __setitem__(self, key, value):
        value = pickle.dumps(value)
        # INSERT OR REPLACE handles both new keys and updates in one statement
        self.db.execute('INSERT OR REPLACE INTO dict VALUES (?, ?)', (key, value))
        self.db.commit()

    def __getitem__(self, key):
        cursor = self.db.execute('SELECT value FROM dict WHERE key = (?)', (key,))
        result = cursor.fetchone()
        if result is None:
            raise KeyError(key)
        return pickle.loads(result[0])

    def __delitem__(self, key):
        if key not in self:
            raise KeyError(key)
        self.db.execute('DELETE FROM dict WHERE key = (?)', (key,))
        self.db.commit()

    def __iter__(self):
        # ORDER BY makes the iteration order deterministic (alphabetical)
        return iter([row[0] for row in self.db.execute('SELECT key FROM dict ORDER BY key')])

    def __repr__(self):
        list_of_str = ['%r: %r' % pair for pair in self.items()]
        return '{' + ', '.join(list_of_str) + '}'

    def __len__(self):
        # count in SQL rather than materializing all the keys
        return self.db.execute('SELECT COUNT(*) FROM dict').fetchone()[0]



>>> d = DBDict(raymond='red', rachel='blue')
>>> d
{'rachel': 'blue', 'raymond': 'red'}
>>> d['critter'] = ('xyz', [1,2,3])
>>> d['critter']
('xyz', [1, 2, 3])
>>> len(d)
3
>>> list(d)
['critter', 'rachel', 'raymond']
>>> list(d.keys())
['critter', 'rachel', 'raymond']
>>> list(d.items())
[('critter', ('xyz', [1, 2, 3])), ('rachel', 'blue'), ('raymond', 'red')]
>>> list(d.values())
[('xyz', [1, 2, 3]), 'blue', 'red']

The above keeps your database in one file. You can navigate the object just like a regular Python dictionary. Because the values are pickled into a single field, sqlite won't give you any additional query options. Other flat-file storage has similar limitations. If you need to write queries that traverse a hierarchical structure, consider a NoSQL database instead.
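For the original question's per-letter files, something like the following could merge them into one place (a sketch; the helper name and the `words_*.pkl` file pattern are assumptions, adjust them to the real filenames):

```python
import glob
import pickle

def merge_pickle_files(target, pattern):
    """Load each pickled per-letter dictionary matching `pattern` and
    merge its entries into `target` (a DBDict or any dict-like object)."""
    for path in sorted(glob.glob(pattern)):
        with open(path, 'rb') as f:
            target.update(pickle.load(f))  # each file holds one dict of entries
    return target
```

Called as `merge_pickle_files(DBDict('lexicon.db'), 'words_*.pkl')`, this would leave every entry accessible through the single database file.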

Smells like a document store database to me. Take a look at CouchDB: http://couchdb.apache.org/