With databoxx, you can quickly and easily collect and modify content and information from several different data sources via a central GUI, create new content, export it to any desired channel, and write it back to the original data source.
databoxx makes it possible to map new business cases holistically and flexibly, and even exploratively where needed, within the existing system architecture and application landscape. This lets companies react quickly to emerging market trends or develop their own services and differentiate themselves from their competitors.
By flexibly connecting existing systems, customers can pursue a "best of breed" approach for specialized applications and avoid the vendor lock-in that can occur in isolated cases with integrated platform providers.
databoxx makes it possible:
- to integrate existing specialized systems with each other in an AI-supported, virtual, and automated/self-learning way
- to provide a completely new user experience with a business-case-specific UI
- to route the merged, and possibly enriched, data directly to connected target systems via a configurable API gateway
- to flexibly integrate manufacturers, suppliers, or dealers and define them as data recipients or data suppliers
- to harmonize sources of supply
- to make data available in real time
One of the basic concepts of databoxx is the ability to integrate previously isolated systems via data virtualization without moving the data contained within them.
- databoxx accesses the original data of the connected data source in real time, so that synchronization is not necessary.
- To achieve this without extensive development effort, standard data source proxies and AI-supported metadata generation and matching procedures are used to link the data from the connected silo systems.
Using the previously described AI-supported metadata generation and matching procedures, metadata is generated from the images.
Both the EXIF data of the images and information obtained from the images themselves via recognition, such as keywords/texts, objects, people, scenes, and activities, are taken into account.
This information is then used to match images to articles automatically: both identifiers found in the images and semantic matches are taken into account, so the assignment requires no manual effort.
Mass imports are not necessary thanks to databoxx's virtualization-based integration, which accesses the original article data in the source system in real time. However, where mass imports are required for assets, this has so far been performed successfully using a replication system.
When assets are imported, a hash value is used to filter out pixel-identical duplicates; duplicates are not imported. The duplicate-detection procedure is configurable.
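The hash-based filtering at import time can be sketched as below. This is a simplified illustration: it hashes the raw asset bytes with SHA-256, whereas detecting pixel-identical duplicates in practice would hash normalized pixel data, and the concrete procedure in databoxx is configurable.

```python
import hashlib

def import_assets(assets: dict[str, bytes], seen_hashes: set[str]) -> list[str]:
    """Import assets, skipping any whose content hash was already seen."""
    imported = []
    for name, data in assets.items():
        digest = hashlib.sha256(data).hexdigest()
        if digest in seen_hashes:
            continue  # duplicate detected: the import is not performed
        seen_hashes.add(digest)
        imported.append(name)
    return imported

seen: set[str] = set()
import_assets({"a.jpg": b"\x01", "copy.jpg": b"\x01", "b.jpg": b"\x02"}, seen)
# "copy.jpg" is skipped because its hash matches "a.jpg"
```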
The DAM supports all data formats for import. The only limitation concerns the generation of a preview image.
With the Pixelboxx framework "Pixelbridge", all web and print formats can be generated for output.
The claim of databoxx is to be able to virtually integrate any data source. To meet this claim, objects from the source systems are mapped in a generic form (lists of key-value pairs, analogous to JSON).
The connection between the data source and the databoxx is established via a so-called Data-Source-Proxy, which, among other things, converts the source object into the generic object format of the databoxx and back.
A change to the data model in one of the connected source systems therefore only needs to be reflected in the configuration of the corresponding proxy.
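A Data-Source-Proxy of this kind can be sketched as follows. The source system, its field names (`artikelNr`, `bezeichnung`), and the mapping table are hypothetical; the point is that the conversion to and from the generic key-value format is driven by configuration, so a data-model change only touches the mapping.

```python
# Sketch of a Data-Source-Proxy: converts a source object into the
# generic key-value representation of databoxx and back.

Generic = dict[str, str]

class PimProxy:
    """Hypothetical proxy for a PIM-like source system."""

    # Configurable mapping: source field name -> generic field name.
    # A changed data model in the source system is handled here alone.
    FIELD_MAP = {"artikelNr": "sku", "bezeichnung": "title"}

    def to_generic(self, source_obj: dict) -> Generic:
        return {self.FIELD_MAP.get(k, k): str(v) for k, v in source_obj.items()}

    def from_generic(self, generic_obj: Generic) -> dict:
        reverse = {v: k for k, v in self.FIELD_MAP.items()}
        return {reverse.get(k, k): v for k, v in generic_obj.items()}

proxy = PimProxy()
generic = proxy.to_generic({"artikelNr": "A-100", "bezeichnung": "Red Shoe"})
# generic == {"sku": "A-100", "title": "Red Shoe"}
```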
Any necessary transformations of the merged data can be performed by simple worker services and persisted in databoxx in generic form.
In addition, the data can be transformed for output to a connected target system via the configurable API gateway.
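Target-specific output via the gateway can be sketched as a configurable route table: each connected target system gets its own transformation of the generic object. All names here (`transform_for_shop`, `transform_for_print`, `ROUTES`) are hypothetical illustrations, not databoxx APIs.

```python
# Sketch: per-target transformations of the generic object, selected
# by a configurable routing table in the API gateway.

def transform_for_shop(obj: dict[str, str]) -> dict:
    """Hypothetical transformation for a connected shop system."""
    return {"id": obj["sku"], "name": obj["title"]}

def transform_for_print(obj: dict[str, str]) -> dict:
    """Hypothetical transformation for a print/catalog pipeline."""
    return {"article_number": obj["sku"], "headline": obj["title"].upper()}

# The gateway configuration: target system -> transformation.
ROUTES = {"shop": transform_for_shop, "print": transform_for_print}

def route(target: str, generic_obj: dict[str, str]) -> dict:
    return ROUTES[target](generic_obj)

generic = {"sku": "A-100", "title": "Red Shoe"}
route("shop", generic)   # shop-specific payload
route("print", generic)  # print-specific payload
```

Adding a new target system then means adding one transformation and one route entry, without touching the source-side proxies.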