A SQL Query walks into a bar.
In one corner of the bar are two tables.
The Query walks up to the tables and asks:
– Mind if I join you?
Some of the latest blogs and videos below for your Monday enjoyment. Have a good week!
Blogs of the week
Julian Dontcheff begins by quoting David Ulevitch: “The way that worms and viruses spread on the Internet is not that different from the way they spread in the real world, and the way you quarantine them is not that different, either.”
He then adds, “And now, in Oracle 19c, you can do the same with SQL.”
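The 19c feature Julian is referring to is SQL Quarantine, managed through the DBMS_SQLQ package. A minimal sketch (the SQL_ID and the five-second CPU threshold are placeholder values):

```sql
DECLARE
  l_quarantine VARCHAR2(128);
BEGIN
  -- create a quarantine configuration for a specific (placeholder) SQL_ID
  l_quarantine := DBMS_SQLQ.CREATE_QUARANTINE_BY_SQL_ID(sql_id => '7h35uxf5uhmm1');

  -- quarantine the statement's plan once it exceeds 5 seconds of CPU
  DBMS_SQLQ.ALTER_QUARANTINE(
    quarantine_name => l_quarantine,
    parameter_name  => 'CPU_TIME',
    parameter_value => '5');
END;
/
```

After the Resource Manager terminates the statement for exceeding the threshold, its plan is quarantined and later executions are rejected before they consume resources.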
He includes this video:
Jonathan Lewis starts by saying, “Editorial note: this is something I started writing in 2013, managed to complete in 2017, and still failed to publish. It should have been a follow-on to another posting on the oddities of timestamp manipulation.”
He concluded this blog with “Oracle has silently invoked the sys_extract_utc() function on our (free-floating) timestamp column to normalize it to UTC. This is really not very friendly but it does make sense, of course – it would be rather expensive to enforce uniqueness if there were (at least) 24 different ways of storing the same absolute value – and 24 is a conservative estimate.”
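Jonathan's observation is easy to reproduce; a minimal sketch (the table and values here are hypothetical):

```sql
-- a unique constraint on a TIMESTAMP WITH TIME ZONE column
CREATE TABLE ts_demo (
  ts TIMESTAMP WITH TIME ZONE,
  CONSTRAINT ts_demo_uk UNIQUE (ts)
);

-- two different offsets, but the same absolute instant...
INSERT INTO ts_demo VALUES (TIMESTAMP '2019-06-03 10:00:00 +01:00');
INSERT INTO ts_demo VALUES (TIMESTAMP '2019-06-03 09:00:00 +00:00');  -- fails: ORA-00001

-- ...because the supporting index is built on the UTC-normalized value:
SELECT sys_extract_utc(ts) FROM ts_demo;
```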
Brendan Tierney opens with, “In a previous post I showed how you can normalize data using the in-database machine learning feature using the DBMS_DATA_MINING.TRANSFORM function. This same function can be used to perform many more data transformations with standardized routines. When it comes to missing data, where you have some case records where the value for an attribute is missing you have a number of options open to you. The first is to evaluate the degree of missing values for the attribute for the data set as a whole. If it is very high, you may want to remove that attribute from the data set. But in scenarios when you have a small number or percentage of missing values you will want to find an appropriate or an approximate value. Such calculations can involve the use of calculating the mean or mode.”
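The mean-based treatment Brendan describes can be sketched with the DBMS_DATA_MINING_TRANSFORM package (MINING_DATA and the other object names here are hypothetical; check the parameter names against the documentation for your release):

```sql
BEGIN
  -- definition table that will hold the replacement value per column
  DBMS_DATA_MINING_TRANSFORM.CREATE_MISS_NUM('miss_num_defs');

  -- compute the mean of each numeric attribute in MINING_DATA
  DBMS_DATA_MINING_TRANSFORM.INSERT_MISS_NUM_MEAN(
    miss_table_name => 'miss_num_defs',
    data_table_name => 'MINING_DATA');

  -- expose a view in which NULLs are replaced by the stored means
  DBMS_DATA_MINING_TRANSFORM.XFORM_MISS_NUM(
    miss_table_name => 'miss_num_defs',
    data_table_name => 'MINING_DATA',
    xform_view_name => 'MINING_DATA_IMPUTED');
END;
/
```

For categorical attributes, the analogous routines are CREATE_MISS_CAT and INSERT_MISS_CAT_MODE, which use the mode instead of the mean.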
Liron Amitzil writes, “I’ve worked with OEM in the past, but these days I’m working with it much more than before. I have a client who uses it as their main monitoring system, so we upgraded it to 13.2, and since then I created custom alerts (metric extensions), installed patches, used it to diagnose issues and more. It was always quite complicated to use, but that makes sense as it has lots and lots of features and capabilities. However, some things just don’t seem right. I wonder how many people are using it as their main monitoring system and if it’s quite a lot, do they suffer from the same things or it is just me.”
Ulrich Janke opens with, “In various blog posts we discussed the usage of BI Cloud Connector (BICC) in Oracle SaaS. In our cloud applications for ERP, HCM, EX etc. there exist various options to store flexible data as we know. One option follows the concept of flexible fields, also known as Flexfields. Those are either Key Flexfields (KFF), Descriptive Flexfields (DFF) or Extensible Flexfields (EFF). These data are stored in database tables where the structures are fixed (columns named SEGMENT, ATTRIBUTE etc.) and the content can be flexibly defined. BICC will handle the extraction of these flexible values in the same way as all other data: the underlying database tables are exposed as VOs (View Objects) and whenever a flexfield is part of that structure it can be found in the according view object depending on the requirement to be published or not.”
John Goodwin starts by writing, “The 19.05 EPM Cloud release brought new functionality into Data Management which provides the ability to set workflow modes. The documentation provides the following information on the workflow modes.
“By default, the data load process in Data Management is designed with a well-defined process flow that provides a full audit of the data load process and the ability to drill down and view data in Workbench. However, a full data flow may contribute to lengthy processing times due to archiving of data for audit purposes. Workflow mode options provide scalable solutions when processing large volumes of data, or when an audit is not required and performance is a key requirement.”
Martin Giffy D’Souza advises, “Prior to 12.2, validating dates and numbers was a bit of a pain as you had to write your own custom PL/SQL function. OOS-Utils has a quick solution to this issue and I recommend using oos_util_validation.is_date. If you’re using Oracle 12.2 or above you can (and should) use validate_conversion instead.”
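For reference, VALIDATE_CONVERSION (introduced in 12.2) returns 1 when a value can be converted to the target type and 0 when it cannot:

```sql
SELECT VALIDATE_CONVERSION('2019-06-04' AS DATE, 'YYYY-MM-DD') AS good_date,  -- 1
       VALIDATE_CONVERSION('2019-02-30' AS DATE, 'YYYY-MM-DD') AS bad_date    -- 0 (Feb 30 is not a real date)
FROM dual;
```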
Connor McDonald shares this video:
Francisco Munoz Alvarez writes, “The Database Security Assessment Tool (DBSAT) is an incredible free command line tool provided by Oracle Corporation as a utility to help you verify your database for common database security issues (including security policies and controls in place), as well as helping to identify possible sensitive data stored within the database. To be able to use it you need to have a valid Oracle Support CSI and download it from My Oracle Support [Doc ID 2138254.1].”
Marcelo Ochoa starts his blog by writing, “When moving to the Cloud, one of the first questions (cost aside) is: how can I predict my IO performance? This is because nobody knows which hardware is used under the hood when you create a compute instance.”
This week on Twitter
Dirk Nachbar posted that Oracle 19c is now available for AIX and HP-UX.
Oracle User Groups posted the date (4th June) for the “Explore Oracle PL/SQL” session. Details are here.
Videos such as: