Log Data Source
Log Data Source window
The Log Data Source window is Teneo Studio's querying center; to learn how to open a Log Data Source, see here.
In this window it is possible to:
- perform queries
- see the query history and export query results
- share queries
- view Session data
- synchronize the Log Data Source with the logs imported in the associated Log Archive
- manage augmenters
- manage saved results
- see the overview of Log Data Source sessions and Log Archive sessions
- manage solutions to perform queries with Engine Matching.
The Log Data Source window hosts the querying and result management features, as well as the Session Viewer, the Recent Queries and Shared Queries panels among others.
The window consists of two tabs, Log Data Source and Home, both described in the sections below.
The Home tab provides the following sections:
- Top ribbon containing:
- The Source and Tabs areas offer buttons to close the window, save queries and cancel
- The Query area offers buttons to create, share and save queries
- The Result area allows exporting query results in .csv or .json format
- The Session Viewer area to create a new Session Data tab
- The Query command bar, or TQL editor, where Teneo Query Language queries are typed
- The Queries panel displays:
- Shared Queries and
- Recent Queries performed while the LDS is open, along with the summary of the query results and other execution information.
Before any queries are run, the window also displays a summary of the Basic Query Commands.
For more details on working with queries, please see the Query Log Data page.
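Exported query results can also be post-processed outside of Teneo Studio. The following is a minimal sketch of reading a result exported as .json and re-serializing it as .csv; the field names and the flat-array layout are assumptions for illustration, not the actual export schema Studio produces.

```python
import csv
import io
import json

# Hypothetical exported result: a flat JSON array of row objects.
# The real export layout from the Result area may differ.
exported_json = '[{"userInput": "hello", "count": 12}, {"userInput": "bye", "count": 7}]'
rows = json.loads(exported_json)

# The same rows re-serialized as .csv, mirroring the two export formats.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["userInput", "count"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue().strip())
```

Either format can then be re-imported as a Saved Result or fed into other tooling.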
Recent Queries panel
The Recent Queries panel displays the list of the queries performed while the LDS is open, along with a summary of the query results and other execution information.
The user can open any of the recent queries by double-clicking it.
Shared Queries panel
The Shared Queries feature allows users to save queries written in Teneo Query Language and share them with other users; note that queries are shared at the LDS level, so only users with access to the specific LDS will be able to see them.
Create a Shared Query
To create a new shared query, type in the query and then click the Share button in the top ribbon; alternatively, click Share first to open the dialogue window and create the shared query there.
The following information should be added:
- Name of the shared query; note that the name needs to be unique
- Description of the shared query
- The TQL query string to be shared (if it hasn't been entered already).
The query to share is marked with a star, until the user clicks Save.
Once created, a Shared Query can be run, edited or removed at any time by any developer using the buttons at the top of the Shared Queries panel.
A Shared Query can also be accessed by double-clicking.
Queries are saved as children of the Log Data Source; therefore deleting the Log Data Source also deletes any existing Shared Query.
History and restoration
It is possible to see the history of Shared Queries by clicking the History button on the right side of the query dashboard. Note that the History button only appears when more than one version of the query has been saved.
The developer can also restore an older version of a query by using the Restore button located at the right end of the older version. The system will prompt the developer to confirm the operation.
Shared queries can also be published to Inquire; the TQL query string is then stored in Inquire along with its name. That way, Inquire clients such as dashboards can use a query's name to run TQL queries more easily. Note that published queries can only be inspected from Teneo Studio.
In order to publish a query, simply click the Publish button.
Teneo Studio opens a dialogue window for the user to type the desired publishing name; note that neither blank spaces nor special characters are allowed. Click Publish again below the query to publish it to Inquire.
An Unpublish button also appears once the query has been published.
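The publishing-name rule above can be sketched as a small validator. The exact character set Inquire accepts is not specified here, so restricting names to plain alphanumeric characters is an assumption; the function name is hypothetical.

```python
import re

def is_valid_publish_name(name: str) -> bool:
    """Hypothetical check mirroring the dialogue's rule that a publishing
    name may contain neither blank spaces nor special characters.
    Assumes only ASCII letters and digits are allowed."""
    return bool(re.fullmatch(r"[A-Za-z0-9]+", name))

print(is_valid_publish_name("WeeklySessions"))   # → True
print(is_valid_publish_name("weekly sessions"))  # → False (contains a space)
```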
Access and user permissions
While a shared query is being edited or published, it is locked to prevent other users from modifying it; a small lock icon on the right side indicates this.
A shared query is visible to all users with access to the Log Data Source, and any user with access to the Log Data Source can run a shared query.
- Global users may edit (i.e. add, delete, change) any shared query on any Log Data Source as long as they have the permission to open the Log Data Source query window
- Non-global users may edit any shared query when they are the owner of the solution linked to the open Log Data Source.
Read more about user permissions.
Different users can modify the same Log Data Source and common elements such as shared queries. Because these changes affect other users, anyone who has the Log Data Source open receives a notification whenever another user modifies one of its shared queries.
Session Data Viewer
From the Log Data Source's homepage, it is possible to access relevant session data - such as session, transaction and event properties, variables or metadata - in a visual, user-friendly view using the Session Data Viewer.
To open the Session Data Viewer, simply click the New Session Viewer Tab button in the top ribbon of the Log Data Source; a new Session Viewer tab opens.
Order by and Limit
Use the Order By menu on the right to select how the results are displayed; choose between:
- Query Order
- Transaction Count
- Start Date
- End Date
- Event Count
- Session Id
The user may also sort the results in ascending or descending order, and limit the number of results per page.
Optionally, the user may enter TQL constraints in the command box to narrow the search.
Once the parameters are set, click the Run Query button. The system will prompt the user to select one of the sessions from the Log Data Source to proceed.
Lastly, the user will be presented with the information related to the selected session.
Learn more about the displayed information by visiting the Session Data Model page.
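The Order By, Ascending/Descending and Limit controls behave like a standard sort-then-page operation. The sketch below illustrates that behaviour over hypothetical session summaries; the field names loosely mirror the Order By options but are not the real session schema.

```python
# Hypothetical session summaries; keys mirror the Order By options
# (Transaction Count, Start Date, Session Id) but are illustrative only.
sessions = [
    {"session_id": "s3", "transaction_count": 5, "start_date": "2024-01-03"},
    {"session_id": "s1", "transaction_count": 9, "start_date": "2024-01-01"},
    {"session_id": "s2", "transaction_count": 2, "start_date": "2024-01-02"},
]

def order_sessions(records, key, descending=False, limit=None):
    """Sort by the chosen key, optionally descending, then cap the page size."""
    ordered = sorted(records, key=lambda r: r[key], reverse=descending)
    return ordered[:limit] if limit else ordered

top = order_sessions(sessions, "transaction_count", descending=True, limit=2)
print([r["session_id"] for r in top])  # → ['s1', 's3']
```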
Find and Filter
It is possible to refine the displayed data using the Find and Filter buttons:
- Find keeps all events visible and navigates to the next instance of the selection
- Filter removes all events not selected in the property list and navigates to the first occurrence.
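The difference between the two buttons can be sketched in code: Find leaves the event list intact and only moves the cursor, while Filter narrows the list itself. The event structure and helper names below are hypothetical.

```python
# Hypothetical event list; the real Session Viewer event model differs.
events = [
    {"index": 0, "type": "request"},
    {"index": 1, "type": "response"},
    {"index": 2, "type": "request"},
]

def find_next(all_events, current, predicate):
    """Find: keep every event visible, just return the next matching index."""
    for event in all_events[current + 1:]:
        if predicate(event):
            return event["index"]
    return None

def filter_events(all_events, predicate):
    """Filter: drop non-matching events, landing on the first occurrence."""
    return [event for event in all_events if predicate(event)]

is_request = lambda e: e["type"] == "request"
print(find_next(events, 0, is_request))                         # → 2
print([e["index"] for e in filter_events(events, is_request)])  # → [0, 2]
```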
Log Data Source tab
The Log Data Source tab provides the following options, each described in the sections below:
- Overview provides the graph of available sessions and the covered weeks, as well as the status and configuration of the Log Data Source
- Synchronize updates the Log Data Source with the logs imported in the associated Log Archives
- Augmenters allows managing adorners and aggregators
- Saved Results allows viewing, exporting, uploading or deleting saved results
- Solutions provides information related to the available queryable solutions
Overview
The Overview page in the backstage of the Log Data Source window (Log Data Source tab > Overview) summarizes the contents of the Log Data Source: mainly the range of logs it contains, along with the overview and table (also available before opening the LDS window) with information about the sessions contained in the particular Log Data Source and its associated Log Archive.
Synchronize
The Synchronize page allows developers to synchronize any Log Data Source to the latest log content stored in the Log Archive in Cassandra.
To synchronize the content, follow these steps:
- Go to the backstage of the Log Data Source window (Log Data Source tab)
- Select Synchronize in the left side menu
- Click Sync to start the synchronization
Teneo Studio starts the process and displays the synchronization progress in the Progress view.
Note that augmenters are applied each time a Log Data Source is synchronized.
Should any errors occur during the synchronization, the system shows an informative message and displays a list of the errors at the bottom of the Synchronize page.
More information is available for each error by clicking the View button. Removing all errors can be done by clicking Clear All.
Clear LDS contents
Finally, it is possible to clear the contents of the Log Data Source; to do so, follow these steps:
- In the backstage of the Log Data Source window (Log Data Source tab), select Synchronize
- Click Clear
- In the Clear Data prompt, write the name of the Log Data Source
- Click Clear Data to confirm the action
Augmenters
The Augmenters section allows users to create and manage augmenters. Augmenters are applied to logs each time a session is imported into Elasticsearch or a Log Data Source is synchronized; they enrich or summarize log data in a format that can later be accessed by standard queries. There are two types of augmenters in Teneo: Adorners and Aggregators.
Below please find a brief overview of the Augmenters section; more details (i.e., how to create augmenters and apply them) are available here.
The following information related to Augmenters is available:
- if 100% is displayed, the current configuration of the augmenter is applied in the Log Data Source
- if 0% is displayed, the Log Data Source needs to be updated to reflect the latest changes in the augmenter; also see Pending Actions
- Name refers to the name of the augmenter
- Version is the version of the augmenter, any time the augmenter is modified (including enable and disable actions), the version is incremented
- Type states if the augmenter is TQL or Groovy script
- Keys refer to the augmenter's queryable properties, they are generated when the augmenter is created and are unique
- Created specifies the date of creation
- Updated specifies the date of the last update to the augmenter, this does not imply that the Log Data Source contains the latest version of the augmented data; also see Pending Actions
- Status: Enabled or Disabled; if disabled, the augmenter will not be applied the next time the Log Data Source is updated; also see Pending Actions
Note that it is possible to Disable/Enable, Edit and Delete an augmenter using the buttons available at the far right side of the augmenter (click the down arrow); learn more in the Augmenters page.
Pending Actions
When an augmenter is created, imported, edited or enabled/disabled, the changes are usually applied automatically to the augmenter configuration stored in Elasticsearch. However, the data generated by the augmenters is not modified in the Log Data Source until it is manually updated in the Pending Actions section.
Similarly, every session that is generated on a published solution is imported and augmented with the version of the augmenters that are configured at that point in time. As a result, if a developer updates an augmenter but doesn't apply the changes on the Log Data Source, new sessions will be augmented using the new version of the augmenter while all the old sessions already imported will remain augmented with the old version.
To apply the new augmenter version to older sessions, the developer should click the Perform or Perform All button: Perform applies only the selected augmenter, while Perform All applies all pending actions.
Remember that the augmenter data available for querying is not up-to-date if there is any item in the pending action list.
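The versioning behaviour described above can be sketched as follows: new sessions are augmented with whatever augmenter version is configured at import time, so older sessions stay on the old version until the pending action is performed. All names and structures here are illustrative, not the actual Elasticsearch configuration.

```python
# Minimal sketch of augmenter versioning; "sentiment" is a hypothetical augmenter.
augmenter = {"name": "sentiment", "version": 1}

def import_session(session, active_augmenter):
    """New sessions are augmented with whatever version is configured now."""
    session["augmented_with"] = active_augmenter["version"]
    return session

old_session = import_session({"id": "s1"}, augmenter)

augmenter["version"] += 1  # augmenter edited: the version is incremented
new_session = import_session({"id": "s2"}, augmenter)

# Until Perform re-applies the augmenter, the old session stays stale:
print(old_session["augmented_with"], new_session["augmented_with"])  # → 1 2
```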
Saved Results
In the Home tab of the Log Data Source window, the developer can save query results by exporting them as .csv or .json in order to reuse them in subsequent queries; learn how in Export query results.
Saved Results are static, i.e., they are not updated when a Log Data Source is synched or its configuration is changed.
In the Saved Results page in the backstage of the Log Data Source window (Log Data Source tab > Saved Results), the developer can manage the Saved Results available in the Log Data Source.
Export Saved Results
It is possible to export any of the Saved Results to a .csv or .json file using the Export button available at the far right side of a specific Saved Result. To do so, simply click Export and select between csv or json.
It is also possible to export a selection of, or all, Saved Results as a .zip file using the Export button at the top of the Saved Results view.
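The bulk export above can be pictured as bundling one file per Saved Result into a single archive. This sketch assumes each Saved Result is stored as a .json file inside the .zip; the real archive layout Studio produces may differ, and the result names are made up.

```python
import io
import json
import zipfile

# Hypothetical Saved Results; names and contents are illustrative only.
saved_results = {
    "january_sessions.json": [{"id": "s1"}, {"id": "s2"}],
    "february_sessions.json": [{"id": "s3"}],
}

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as archive:
    for filename, rows in saved_results.items():
        archive.writestr(filename, json.dumps(rows))

# Reading the archive back lists one entry per Saved Result.
with zipfile.ZipFile(buf) as archive:
    print(sorted(archive.namelist()))
```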
Import new Saved Results
The developer can also import new Saved Results, provided these were previously exported in .json format, using the Import Result Data button in the upper right corner of the window.
The Saved Results' Import Result Data view opens, and the developer must write the name of the new Saved Results as well as select the file to import; finish by clicking Execute.
Solutions
Queries can be run against the Engine using existing solution .war files, which adds an extra layer of Natural Language Processing to the query by matching the queried text against user inputs, Language Objects, etc.
To manage queryable solutions, go to the Log Data Source window's backstage and select Solutions.
The user may import, delete or reload solutions for Engine matching using the buttons in the top right corner.
Only solutions from the same Platform version are supported
Solutions for Engine matching need to be of the same version as the Teneo Platform version in use. If they come from a previous Platform version, the solutions need to be re-published as web archives (.war format) in the current Platform version before being imported for querying.
Import of WAR file
To perform analyses on log data using solution .war files, the developer should first create a web archive (.war) file of the solution and its associated lexical resources; details on how to publish as a web archive are available here.
Next, follow these steps:
- Go to the Log Data panel (Solution tab > Optimization > Log Data)
- Open the desired Log Data Source
- Now, in the backstage of the Log Data Source window (Log Data Source tab > Solutions), import the .war file:
- Click Import Solution
- Browse to the .war file and select it
- Click Open
- Back in the Log Data Source window, give the solution an alias; the alias is used to refer to this solution in queries
- Lastly, click Upload.
Teneo uploads the solution to the Log Data Source and when finished, the solution is available for log data analysis.