Default Filter Criteria - "Current (Indexed)" Column?
Hello Technet,
When you create a "Standard View" in SharePoint online, if you look under the Filter section, a default filter is automatically added on the "Current (Indexed)" column, causing the list to show only list items that have Current = Yes.
My question is in two parts:
What does this Current (Indexed) column mean/do?
Why, as my list has approached 2,500 items, are many of the newly created list items not showing in views that have this criterion?
I am unable to find that column when I modify the list, so I assume it has something to do with SharePoint's automatic version control. I don't understand why this would suddenly stop working. One thought: I do know that SharePoint lists are subject to a 5,000-item view limit. Is it possible the "versions" list, if that exists on the back end, has gotten too large and is no longer saving items?
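For background, SharePoint blocks queries that would have to scan past the list view threshold unless the filter column is indexed. A rough sketch of how such a filter could be expressed against the REST API follows; the site URL, list title, and the assumption that Current is an indexed Yes/No column are all placeholders, not something confirmed by this thread:

```python
from urllib.parse import quote

def build_filtered_items_url(site_url, list_title, filter_expr, page_size=100):
    """Build a SharePoint REST URL that filters list items server-side.

    Filtering on an indexed column lets the query stay under the 5,000-item
    list view threshold; an unindexed filter over a larger list is blocked.
    """
    return (
        f"{site_url}/_api/web/lists/getbytitle('{quote(list_title)}')/items"
        f"?$filter={quote(filter_expr)}&$top={page_size}"
    )

# 'Current eq 1' assumes "Current" is an indexed Yes/No column, as in the
# default view filter described above; names here are hypothetical.
url = build_filtered_items_url("https://contoso.sharepoint.com/sites/hr",
                               "Requests", "Current eq 1")
print(url)
```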
As there is no default view with a filter already enabled out of the box, I would suggest you create a service ticket with O365 support or via MS Premier Support.
Michiel Hamers www.SharePointman.nl Don't hesitate to contact me for a SharePoint/O365 question.
Similar Messages
-
Filter Criteria will not save in SharePoint Designer
I have a web page that contains a form web part (for entering search keywords) and a data view successfully configured against a SQL database table.
In SPD > dataview > Common Tasks:
I have created a parameter with control type and textbox control ID, then set up the web part connection to link both web parts.
In the final step I then try to set the filter criteria, choosing columns with the available option of parameter name, and click OK. The filter values disappear.
I have followed the steps from the following link:
http://andyparkes.co.uk/blog/index.php/2008/10/31/creating-a-multi-column-search-form-in-sharepoint-v3-with-sharepoint-designer/
Why does the Filter criteria not save?
Thanks in advance.
-- techChirag

Hey, I don't suppose anyone's found a solution to this yet? I've successfully done this on a few different Site Collections, but for some reason it's not working on my latest one. As above, after setting the Filter Criteria and saving the page in SPD 2010, the filter criteria just disappears. Random! -
How to refresh/apply column value default setting on current files or folders
Hi All
I have set-up default column data per folder in my library (via
Library Settings > Column default settings) and it works great for new documents or folders that are added to the library.
But what do I do if I have an existing Library with folders and files and need to apply default column data to each? Is there a way of "refreshing" the default columns so that the data is populated through a specific folder and/or its sub-folders?
(I really hope this is an easy fix or just a setting that I over-looked somewhere!)
Thank you!

I had to do this as well recently, and remembered your post.
Here is the function I wrote; it worked for text, choice, and metadata columns.
It is pretty slow and could be optimized and broken up into more functions, but I had to do several things:
1. I mass-updated the content types in a library
2. On Library settings : set default values and also different defaults per folder
3. For each file I then needed to:
3.a. either copy the value from a column in the old content type to the new, or
3.b. set the column to the default
So this function does step 3. As I said, it works for certain types of columns and can be sped up (I used it to update 700 files in a couple of minutes), and it makes some assumptions about the environment, but it is at least a starting point.
As Alex said you may want to change SystemUpdate($true) to just Update(), depending on your requirements.
<#
.SYNOPSIS
Resets columns in a document library to defaults for blank columns. Use this
after changing the content types or adding columns to a doc lib with existing files
.DESCRIPTION
Resets columns in a doc lib to their defaults. Will only set them if the columns are blank (unless overridden)
Will also copy some values from one column to another while you are there.
Can restrict the update to a subset of columns, or have it look for all columns with defaults.
Will use the list defaults as well as folder defaults.
All names of columns passed in should use InternalName.
This has ONLY been tested on Text, Choice, Metadata, multi-Choice, and multi-Metadata columns
Pass in a list and it will recursively travel down the list to update all items with the defaults for the items in that folder.
If you call it on a folder, it will travel up the tree of folders to find the proper defaults
Author:
Chris Buchholz
[email protected]
@plutosdad
.PARAMETER list
The document library to update. Using this parameter it will update all files in the doc lib
.PARAMETER folder
The folder containing files to update. Function will update all files in this folder and subfolders.
.PARAMETER ParentFolderDefaults
Hashtable of internal field names as KEY, and value VALUE, summing up all the parent folders or list defaults.
If not supplied, then the function will travel up the tree of folders to the parent doclib to determine
the correct defaults to apply.
If the field is managed metadata, then the value is a string
Currently only tested for string and metadata values, not lookup or date
.PARAMETER termstore
The termstore to use if you are going to update managed metadata columns, this assumes we are only using the one termstore for all columns to update
If you are using the site collection specific termstore for some columns you want to update, and
the central termstore for others, then you should call this method twice, once with each termstore,
and specify the respective columns in fieldsToUpdate
.PARAMETER fieldsToCopy
Hashtable of internal field names, where KEY is the "to" field, and VALUE is the "from" field
Use this to copy values from one field to another for the item.
These override the defaults, and also cause the "from" (Value) fields to NOT be overwritten with defaults even if
they are in the fieldsToUpdate array.
Example: @{"MyNewColumn" = "My_x0020_Old_x0020_Column"}
.PARAMETER fieldsToUpdate
If supplied then the method will update only the fields in this array to their default values, if null then it will update
all fields that have defaults.
If you pass in an empty array, then this method will only copy fields in the fieldtocopy and not
apply any defaults
Example: @() - to only copy and not set any fields to default
Example2: @('UpdateField1','UpdateField2') will update only those two fields to their defaults
.EXAMPLE
Set-SPListItemValuesToDefaults -list $list -fieldsToCopy @{"MyNewColumn" = "My_x0020_Old_x0020_Column"} -fieldsToUpdate @() -overwrite -termStore $termStore
This will not set any defaults, but instead only set MyNewColumn to non null values of My_x0020_Old_x0020_Column
It will overwrite any values of MyNewColumn
.EXAMPLE
Set-SPListItemValuesToDefaults -list $list -overwrite
This will set all columns to their default values even if they are filled in already
.EXAMPLE
Set-SPListItemValuesToDefaults -folder $list.RootFolder.SubFolder[3].SubFolder[5]
This will set all columns to their defaults in the given subfolder of a library
.EXAMPLE
Set-SPListItemValuesToDefaults -list $list -fieldsToUpdate @('ColumnOneInternalName','ColumnTwoInternalName')
This will set columns ColumnOneInternalName and ColumnTwoInternalName to their defaults for all items where they are currently null
.EXAMPLE
Set-SPListItemValuesToDefaults -list $list -fieldsToCopy @{"MyNewColumn" = "My_x0020_Old_x0020_Column"} -fieldsToUpdate @("MyNewColumn") -termStore $termStore
This will set all MyNewColumn values to their default, and then also copy the values of My_x0020_Old_x0020_Column to MyNewColumn where the old column is not null,
but both of these will only happen for items where MyNewColumn is null
.EXAMPLE
Set-SPListItemValuesToDefaults -list $list -fieldsToCopy @{"MyNewColumn" = "My_x0020_Old_x0020_Column"} -termStore $termStore
This will set ALL columns with defaults to the default value (if the item's value is null),
except for My_x0020_Old_x0020_Column which will not be modified even if it has a default value, and will also set MyNewColumn to the
value of My_x0020_Old_x0020_Column if the old value is not null
#>
function Set-SPListItemValuesToDefaults {
    [CmdletBinding(SupportsShouldProcess=$true)]
    param(
        [Parameter(Mandatory=$true,ValueFromPipeline=$true,ParameterSetName="List")][Microsoft.SharePoint.SPList]$list,
        [Parameter(Mandatory=$true,ValueFromPipeline=$true,ParameterSetName="Folder")][Microsoft.SharePoint.SPFolder]$folder,
        [Parameter(Mandatory=$false,ParameterSetName="Folder")][HashTable]$ParentFolderDefaults,
        [Parameter(Mandatory=$false)][HashTable]$fieldsToCopy,
        [Parameter(Mandatory=$false)][Array]$fieldsToUpdate,
        [Parameter(Mandatory=$false)][Microsoft.SharePoint.Taxonomy.TermStore]$termStore,
        [Switch]$overwrite,
        [Switch]$overwriteFromFields
    )
    begin {
        #one or both can be null, but if both empty, then nothing to do
        if ($null -ne $fieldsToUpdate -and $fieldsToUpdate.Count -eq 0 -and
            ( $null -eq $fieldsToCopy -or $fieldsToCopy.Count -eq 0)) {
            Write-Warning "No fields to update OR copy"
            return
        }
        if ($PSCmdlet.ParameterSetName -eq "Folder") {
            $list = $folder.DocumentLibrary
        }
        if ($null -eq $termStore ) {
            $taxonomySession = Get-SPTaxonomySession -site $list.ParentWeb.Site
            $termStores = $taxonomySession.TermStores
            $termStore = $termStores[0]
        }
        #if we did not pass in the parent folder defaults then we must go backward up tree
        if ($PSCmdlet.ParameterSetName -eq "Folder" -and $null -eq $ParentFolderDefaults ) {
            $ParentFolderDefaults = @{}
            if ($null -eq $fieldsToUpdate -or $fieldsToUpdate.Count -gt 0) {
                Write-Debug "ParentFolderDefaults is null"
                #per-folder column defaults; the original post referenced $columnDefaults
                #without showing the assignment, so this line is an assumption
                $columnDefaults = New-Object Microsoft.Office.DocumentManagement.MetadataDefaults($folder.DocumentLibrary)
                $tempfolder = $folder.ParentFolder
                while ($tempfolder.ParentListId -ne [Guid]::Empty) {
                    Write-Debug "at folder $($tempfolder.Url)"
                    $pairs = $columnDefaults.GetDefaultMetadata($tempfolder)
                    foreach ($pair in $pairs) {
                        if (!$ParentFolderDefaults.ContainsKey($pair.First)) {
                            Write-Debug "Folder $($tempfolder.Name) default: $($pair.First) = $($pair.Second)"
                            $ParentFolderDefaults.Add($pair.First,$pair.Second)
                        }
                    }
                    $tempfolder = $tempfolder.ParentFolder
                }
                #listdefaults
                Write-Debug "at list"
                foreach ($field in $folder.DocumentLibrary.Fields) {
                    if ($field.InternalName -eq "_ModerationStatus") { continue }
                    #$field = $list.Fields[$name]
                    if (![String]::IsNullOrEmpty($field.DefaultValue)) {
                        #Write-Verbose "List default found key $($field.InternalName)"
                        if (!$ParentFolderDefaults.ContainsKey($field.InternalName)) {
                            Write-Debug "List Default $($field.InternalName) = $($field.DefaultValue)"
                            $ParentFolderDefaults.Add($field.InternalName,$field.DefaultValue)
                        }
                    }
                }
            }
        }
    }
    process {
        Write-Debug "Calling with $($PSCmdlet.ParameterSetName)"
        Write-Debug "Parent folder hash has $($ParentFolderDefaults.Count) items"
        if ($PSCmdlet.ParameterSetName -eq "List" ) {
            $folder = $list.RootFolder
            $ParentFolderDefaults = @{}
            if ($null -eq $fieldsToUpdate -or $fieldsToUpdate.Count -gt 0) {
                foreach ($field in $list.Fields) {
                    if ($field.InternalName -eq "_ModerationStatus") { continue }
                    if (![String]::IsNullOrEmpty($field.DefaultValue)) {
                        Write-Debug "List Default $($field.InternalName) = $($field.DefaultValue)"
                        $ParentFolderDefaults.Add($field.InternalName,$field.DefaultValue)
                    }
                }
            }
        }
        Write-Verbose "At folder $($folder.Url)"
        $FolderDefaults = @{}
        $FolderDefaults += $ParentFolderDefaults
        if ($null -eq $fieldsToUpdate -or $fieldsToUpdate.Count -gt 0) {
            #per-folder column defaults (assumed constructor, as noted in begin)
            $columnDefaults = New-Object Microsoft.Office.DocumentManagement.MetadataDefaults($folder.DocumentLibrary)
            $pairs = $columnDefaults.GetDefaultMetadata($folder)
            foreach ($pair in $pairs) {
                if ($FolderDefaults.ContainsKey($pair.First)) {
                    $FolderDefaults.Remove($pair.First)
                }
                Write-Debug "Folder $($folder.Name) default: $($pair.First) = $($pair.Second)"
                $FolderDefaults.Add($pair.First,$pair.Second)
            }
        }
        #set values
        foreach ($file in $folder.Files) {
            if ($file.CheckOutType -ne [Microsoft.SharePoint.SPFile+SPCheckOutType]::None) {
                Write-Warning "File $($file.Url).CheckOutType = $($file.CheckOutType)) ... skipping"
                continue
            }
            $item = $file.Item
            $ItemDefaults = @{}
            $ItemDefaults += $FolderDefaults
            #if we only want certain fields then remove the others
            #Move this to every time we add values to the defaults
            if ($null -ne $fieldsToUpdate ) {
                $ItemDefaults2 = @{}
                foreach ($fieldInternalName in $fieldsToUpdate) {
                    try {
                        $ItemDefaults2.Add($fieldInternalName,$ItemDefaults[$fieldInternalName])
                    } catch { } #who cares if not in list
                }
                $ItemDefaults = $ItemDefaults2
            }
            #do not overwrite already filled in values unless specified
            if (!$overwrite) {
                $keys = @($ItemDefaults.Keys)
                for ($i = $keys.Count - 1; $i -ge 0; $i-- ) {
                    $key = $keys[$i]
                    try {
                        $val = $item[$item.Fields.GetFieldByInternalName($key)]
                        if ($null -ne $val) {
                            $ItemDefaults.Remove($key)
                        }
                    } catch {} #if fieldname does not exist then ignore, we should check for this earlier
                }
            }
            #do not overwrite FROM fields in copy list unless specified
            if (!$overwriteFromFields) {
                if ($null -ne $fieldsToCopy -and $fieldsToCopy.Count -gt 0) {
                    foreach ($value in $fieldsToCopy.Values) {
                        try {
                            $ItemDefaults.Remove($value)
                        } catch {} #who cares if not in list
                    }
                }
            }
            #do not overwrite TO fields in copy list if we're going to copy instead
            if (!$overwriteFromFields) {
                if ($null -ne $fieldsToCopy -and $fieldsToCopy.Count -gt 0) {
                    foreach ($key in $fieldsToCopy.Keys) {
                        $fromfield = $item.Fields.GetFieldByInternalName($fieldsToCopy[$key])
                        try {
                            if ($null -ne $item[$fromfield]) {
                                $ItemDefaults.Remove($key)
                            }
                        } catch {} #who cares if not in list
                    }
                }
            }
            Write-Verbose $item.Url
            $namestr = [String]::Empty
            if ($ItemDefaults.Count -eq 0) {
                Write-Verbose "No defaults, copy only"
            } else {
                $str = $ItemDefaults | Out-String
                $namestr += $str
                Write-Verbose $str
            }
            if ($null -ne $fieldsToCopy -and $fieldsToCopy.Count -gt 0) {
                $str = $fieldsToCopy | Out-String
                $namestr += $str
            }
            if ($PSCmdlet.ShouldProcess($item.Url,"Set Values: $namestr")) {
                #defaults
                if ($null -ne $ItemDefaults -and $ItemDefaults.Count -gt 0) {
                    foreach ($key in $ItemDefaults.Keys) {
                        $tofield = $item.Fields.GetFieldByInternalName($key)
                        if ($tofield.TypeAsString -like "TaxonomyFieldType*") {
                            $taxfield = [Microsoft.SharePoint.Taxonomy.TaxonomyField]$tofield
                            $taxfieldValue = New-Object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($tofield)
                            $lookupval = $ItemDefaults[$key]
                            $termval = $lookupval.Substring($lookupval.IndexOf('#')+1)
                            $taxfieldValue.PopulateFromLabelGuidPair($termval)
                            if ($tofield.TypeAsString -eq "TaxonomyFieldType") {
                                $taxfield.SetFieldValue($item,$taxfieldValue)
                            } else {
                                #multi
                                $taxfieldValues = New-Object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValueCollection $tofield
                                $taxfieldValues.Add($taxfieldValue)
                                $taxfield.SetFieldValue($item,$taxfieldValues)
                            }
                        } else {
                            $item[$tofield] = $ItemDefaults[$key]
                        }
                    }
                }
                #copyfields
                if ($null -ne $fieldsToCopy -and $fieldsToCopy.Count -gt 0) {
                    #$fieldsToCopy | Out-String | Write-Verbose
                    foreach ($key in $fieldsToCopy.Keys) {
                        $tofield = $item.Fields.GetFieldByInternalName($key)
                        $fromfield = $item.Fields.GetFieldByInternalName($fieldsToCopy[$key])
                        if ($null -eq $item[$fromfield] -or ( !$overwrite -and $null -ne $item[$tofield] )) {
                            continue
                        }
                        if ($tofield.TypeAsString -eq "TaxonomyFieldType" -and
                            $fromfield.TypeAsString -notlike "TaxonomyFieldType*" ) {
                            #non taxonomy to taxonomy
                            $taxfield = [Microsoft.SharePoint.Taxonomy.TaxonomyField]$tofield
                            $termSet = $termStore.GetTermSet($taxfield.TermSetId)
                            [String]$fromval = $item[$fromfield]
                            $vals = $fromval -split ';#' | Where-Object {![String]::IsNullOrEmpty($_)}
                            if ($null -ne $vals -and $vals.Count -ge 0 ) {
                                $val = $vals[0]
                                if ($vals.Count -gt 1) {
                                    Write-Warning "$($item.Url) Found more than one value in $($fromfield.InternalName)"
                                    continue
                                }
                                $terms = $termSet.GetTerms($val,$true)
                                if ($null -ne $terms -and $terms.Count -gt 0) {
                                    $term = $terms[0]
                                    $taxfield.SetFieldValue($item,$term)
                                    Write-Verbose "$($tofield.InternalName) = $($term.Name)"
                                } else {
                                    Write-Warning "Could not determine term for $($fromfield.InternalName) for $($item.Url)"
                                    continue
                                }
                            }
                        } elseif ($tofield.TypeAsString -eq "TaxonomyFieldTypeMulti" -and
                                  $fromfield.TypeAsString -notlike "TaxonomyFieldType*" ) {
                            Write-Debug "we are here: $($item.Name): $($fromfield.TypeAsString) to $($tofield.TypeAsString )"
                            #non taxonomy to multi taxonomy
                            $taxfield = [Microsoft.SharePoint.Taxonomy.TaxonomyField]$tofield
                            $termSet = $termStore.GetTermSet($taxfield.TermSetId)
                            $taxfieldValues = New-Object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValueCollection $tofield
                            [String]$fromval = $item[$fromfield]
                            $vals = $fromval -split ';#' | Where-Object {![String]::IsNullOrEmpty($_)}
                            foreach ($val in $vals) {
                                $terms = $termSet.GetTerms($val,$true)
                                if ($null -ne $terms -and $terms.Count -gt 0) {
                                    $term = $terms[0]
                                    $taxfieldValue = New-Object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($tofield)
                                    $taxfieldValue.TermGuid = $term.Id.ToString()
                                    $taxfieldValue.Label = $term.Name
                                    $taxfieldValues.Add($taxfieldValue)
                                } else {
                                    Write-Warning "Could not determine term for $($fromfield.InternalName) for $($item.Url)"
                                    continue
                                }
                            }
                            #,[Microsoft.SharePoint.Taxonomy.StringMatchOption]::ExactMatch,
                            $taxfield.SetFieldValue($item,$taxfieldValues)
                            $valsAsString = $taxfieldValues | Out-String
                            Write-Debug "$($tofield.InternalName) = $valsAsString"
                        } elseif ($tofield.TypeAsString -eq "TaxonomyFieldTypeMulti" -and
                                  $fromfield.TypeAsString -eq "TaxonomyFieldType" ) {
                            #single taxonomy to multi
                            $taxfieldValues = New-Object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValueCollection $tofield
                            $taxfield = [Microsoft.SharePoint.Taxonomy.TaxonomyField]$tofield
                            $taxfieldValues.Add($item[$fromfield])
                            $taxfield.SetFieldValue($item,$taxfieldValues)
                            $valsAsString = $taxfieldValues | Out-String
                            Write-Verbose "$($tofield.InternalName) = $valsAsString"
                        } elseif ($tofield.TypeAsString -eq "TaxonomyFieldType" -and
                                  $fromfield.TypeAsString -eq "TaxonomyFieldTypeMulti" ) {
                            #multi taxonomy to single taxonomy
                            Write-Warning "multi to non multi - what to do here"
                            continue
                        } elseif ($tofield.TypeAsString -eq "Lookup" -and
                                  $fromfield.TypeAsString -ne "Lookup" ) {
                            #non lookup to lookup
                            Write-Warning "non lookup to lookup - still todo"
                            continue
                        } else {
                            #straight copy
                            $item[$tofield] = $item[$fromfield]
                        }
                    }
                }
                $item.SystemUpdate($false)
            }
        }
        #recurse into subfolders (skip the Forms folder)
        $folders = $folder.SubFolders | Where-Object { $_.Name -ne "Forms" }
        $folders | Set-SPListItemValuesToDefaults -ParentFolderDefaults $FolderDefaults -fieldsToCopy $fieldsToCopy -fieldsToUpdate $fieldsToUpdate -overwrite:$overwrite -overwriteFromFields:$overwriteFromFields -termStore $termStore
    }
}
-
Hi there,
An XSLT List View displays all the list items within the Document Library. To implement search functionality within the document library, the out-of-the-box "Text Filter" web part is configured as explained below. The solution is similar to the one suggested at
http://www.wonderlaura.com/Lists/Posts/Post.aspx?ID=77
1. "Text Filter" Web Part added to the page.
2. Filter Criteria (i.e., XSLT List View columns) added to the XSLT List View, where the filter parameters take their input from the "Text Filter" Web Part.
3. Both Web Parts (the XSLT List View and the Text Filter) are connected.
When the search criteria are entered into the "Text Filter" Web Part, they are passed to the relevant columns of the XSLT List View, and the documents (list items) that match the search criteria are shown within the XSLT List View.
Search functionality works as expected.
Query: Selecting the "Export to Excel" icon from the ribbon generates an Excel spreadsheet with no data except the column titles from the document library. On investigation it was found that adding the 'Filter Criteria' to the XSLT List View causes this bug. When the Filter Criteria are removed, the "Export to Excel" functionality works as expected.
But it is mandatory to add "Filter Criteria" to implement the search functionality within the document library.
Help: Help/input appreciated on a workaround to get the "Export to Excel" functionality to work when "Filter Criteria" exist on the XSLT List View.
Regards,

Once again thanks very much for your help Scott, very much appreciated.
In the investigation it was found that removing the 'Filter Criteria' from the XSLT List View makes the "Export to Excel" functionality work. But the 'Filter Criteria' are mandatory to get the 'Document Search' functionality.
I think that, due to technical limitations, it should be concluded that only custom development can make everything work; no-code solutions using SharePoint Designer cannot meet all the requirements.
If you can think of any alternative solution that could help resolve the current issue, or fix it without any custom implementation, please let me know. Otherwise this issue will be marked as resolved with your suggested response.
Regards, -
Gather_table_stats with a method opt of "for all indexed columns size 0"
I have 9 databases I support that contain the same structure and very similar data concentrations. We are seeing inconsistent performance across the different databases due to bind variable peeking. I have tracked it down to the min and max values that are gathered during the analyze. I analyze from one cluster, and export/import those statistics into the other clusters; I then lock down the gathered stats. Some of the statistics are on tables that contain transient data (the older data is purged, and new data gets a new PK sequence number).
Since I am gathering statistics with 'FOR ALL INDEXED COLUMNS SIZE 1', a min and max value are grabbed. These values are only appropriate for a short period of time, and only for a specific database. I do want Oracle to know the density to help calculate cardinality, but I don't want cardinality based on whether the current bind values fall in this range.
Example
COLUMN PK
When I analyze, the min is 1 and the max is 5. I then let the database run, and the new min is 100 and the max is 105: the same number of rows, but a different min/max. At first, a select * from table where pk >= 1 and pk <= 5 would return a cardinality of 5. Later, a select * from table where pk >= 100 and pk <= 105 would return a cardinality of 1.
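The skew described above is easy to see with a simplified uniform-distribution cardinality model. This is a rough sketch, not Oracle's actual formula; the numbers mirror the example in this post:

```python
# Simplified model of how a cost-based optimizer estimates cardinality for a
# range predicate using only column min/max statistics (no histogram).
def range_cardinality(num_rows, col_min, col_max, lo, hi):
    """Estimate rows returned by `col BETWEEN lo AND hi`, assuming values
    are uniformly distributed between the gathered min and max."""
    if col_max == col_min:
        return num_rows
    # Clamp the predicate range to the value range known from statistics.
    lo_c = max(lo, col_min)
    hi_c = min(hi, col_max)
    if hi_c < lo_c:
        return 1  # out-of-range predicate: floor of one row
    selectivity = (hi_c - lo_c) / (col_max - col_min)
    return max(1, round(num_rows * selectivity))

# Stats gathered when the table held PK values 1..5:
print(range_cardinality(5, 1, 5, 1, 5))      # whole known range: all 5 rows
# After purge/reload the real data is 100..105, but stats still say 1..5:
print(range_cardinality(5, 1, 5, 100, 105))  # out of range: floor of 1
```

This is why stale min/max values from locked or imported statistics can flip the estimated cardinality from "all rows" to "one row" for the same query shape.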
Any ideas how to avoid this, other than trying to set min and max to something myself (like min = 1, max = 99999999)?

MarkDPowell wrote:
The Oracle documentation on bind variable peeking said it did not peek without histograms and I cannot remember ever seeing on 9.2 where the trace showed otherwise.

Mark,
see this simple test case run on 9.2.0.8. No histograms, but bind variable peeking, as you can see that the EXPLAIN PLAN output generated by AUTOTRACE differs from the estimated cardinality of the actual plan used at runtime.
Which documentation do you refer to?
SQL>
SQL> alter session set nls_language = 'AMERICAN';
Session altered.
SQL>
SQL> drop table bind_peek_test;
Table dropped.
SQL>
SQL> create table bind_peek_test
2 as
3 select
4 100 as n1
5 , cast(dbms_random.string('a', 20) as varchar2(20)) as filler
6 from
7 dual
8 connect by
9 level <= 1000;
Table created.
SQL>
SQL> exec dbms_stats.gather_table_stats(null, 'bind_peek_test', method_opt=>'FOR ALL COLUMNS SIZE 1')
PL/SQL procedure successfully completed.
SQL>
SQL> variable n number
SQL>
SQL> variable n2 number
SQL>
SQL> alter system flush shared_pool;
System altered.
SQL>
SQL> exec :n := 1; :n2 := 50;
PL/SQL procedure successfully completed.
SQL>
SQL> set autotrace traceonly
SQL>
SQL> select * from bind_peek_test where n1 >= :n and n1 <= :n2;
no rows selected
Execution Plan
0      SELECT STATEMENT Optimizer=CHOOSE (Cost=2 Card=1000 Bytes=24000)
1    0   FILTER
2    1     TABLE ACCESS (FULL) OF 'BIND_PEEK_TEST' (Cost=2 Card=1000 Bytes=24000)
Statistics
236 recursive calls
0 db block gets
35 consistent gets
0 physical reads
0 redo size
299 bytes sent via SQL*Net to client
372 bytes received via SQL*Net from client
1 SQL*Net roundtrips to/from client
4 sorts (memory)
0 sorts (disk)
0 rows processed
SQL>
SQL> set autotrace off
SQL>
SQL> select
2 cardinality
3 from
4 v$sql_plan
5 where
6 cardinality is not null
7 and hash_value in (
8 select
9 hash_value
10 from
11 v$sql
12 where
13 sql_text like 'select * from bind_peek_test%'
14 );
CARDINALITY
1
SQL>
SQL> alter system flush shared_pool;
System altered.
SQL>
SQL> exec :n := 100; :n2 := 100;
PL/SQL procedure successfully completed.
SQL>
SQL> set autotrace traceonly
SQL>
SQL> select * from bind_peek_test where n1 >= :n and n1 <= :n2;
1000 rows selected.
Execution Plan
0      SELECT STATEMENT Optimizer=CHOOSE (Cost=2 Card=1000 Bytes=24000)
1    0   FILTER
2    1     TABLE ACCESS (FULL) OF 'BIND_PEEK_TEST' (Cost=2 Card=1000 Bytes=24000)
Statistics
236 recursive calls
0 db block gets
102 consistent gets
0 physical reads
0 redo size
34435 bytes sent via SQL*Net to client
1109 bytes received via SQL*Net from client
68 SQL*Net roundtrips to/from client
4 sorts (memory)
0 sorts (disk)
1000 rows processed
SQL>
SQL> set autotrace off
SQL>
SQL> select
2 cardinality
3 from
4 v$sql_plan
5 where
6 cardinality is not null
7 and hash_value = (
8 select
9 hash_value
10 from
11 v$sql
12 where
13 sql_text like 'select * from bind_peek_test%'
14 );
CARDINALITY
1000
SQL>
SQL> spool off

Regards,
Randolf
Oracle related stuff blog:
http://oracle-randolf.blogspot.com/
SQLTools++ for Oracle (Open source Oracle GUI for Windows):
http://www.sqltools-plusplus.org:7676/
http://sourceforge.net/projects/sqlt-pp/ -
SetRepo command to view the current Index repository location
Hi,
I am using the following command to view the current location of the index repository.
./lh -Djava.ext.dirs=/usr/WebSphere/AppServer/java/jre/lib/ext:/usr/WebSphere/AppServer/lib setRepo -icom.ibm.websphere.naming.WsnInitialContextFactory -c
However I get the following error
Feb 11, 2009 5:08:28 PM com.ibm.ejs.j2c.ConnectionFactoryBuilderImpl
SEVERE: SET_METHOD_EXCP_J2CA0036
Feb 11, 2009 5:08:29 PM com.ibm.ejs.j2c.ConnectionFactoryBuilderImpl
SEVERE: CREATE_MANAGED_CONNECTION_FACTORY_DETAILS_EXCP_J2CA0009
Feb 11, 2009 5:08:29 PM com.ibm.ws.naming.util.Helpers
WARNING: jndiGetObjInstErr
Feb 11, 2009 5:08:29 PM com.ibm.ws.naming.util.Helpers
WARNING: jndiNamingException
com.ibm.websphere.naming.CannotInstantiateObjectException: Exception occurred while the JNDI NamingManager was processing a javax.naming.Reference object. [Root exception is java.lang.reflect.InvocationTargetException]
Cannot reach current repository location:com.waveset.util.ConfigurationError: Failed to load JDBC DataSource 'jdbc/eumdevrepo2':
==> com.ibm.websphere.naming.CannotInstantiateObjectException: Exception occurred while the JNDI NamingManager was processing a javax.naming.Reference object
From my appserver admin console, I am able to test my datasource and it works fine without any issues. any ideas as to what could be the problem?
thanks

Hi,
No, there is not a shortcut keyboard command to insert the current date into a Date cell in Datasheet view.
For your issue, you can set the default value of the Date column to “Today’s Date”:
Go to your list -> List Settings -> select the Date column
under Columns section.
Select “Today’s Date” as the default value.
Best Regards,
Eric
Eric Tao
TechNet Community Support -
How to add a default value in a site column for every item in a document library
HI
i created a content type with some site columns ,
and included in a Document library.
Process ( content type)
-ProcessNo
-ProcessName
After that I uploaded 100 documents but did not add a value in the ProcessName site column.
Now, how do I add a default value in a site column for every document in the document library?
adil

HI
I get the below error when I change the script:
PS C:\scripts> C:\Scripts\updatedefaultvalue.ps1
Cannot index into a null array.
At C:\Scripts\updatedefaultvalue.ps1:8 char:7
+ IF($i[ <<<< "Title"] -eq $null)
+ CategoryInfo : InvalidOperation: (Title:String) [], RuntimeExce
ption
+ FullyQualifiedErrorId : NullArray
$web = Get-SPWeb http://tspmcwfe:89/
$list = $web.Lists["test"]
$items = $list.Items | Where { $_["Name"] -eq "Emc" }
foreach ($i in $items) {
    if ($i["Title"] -eq $null) {
        $i["Title"] = "test"
        $i.Update()
    }
}
adil
Why are you piping a where in the items? Do you only want to add the "test" to ones matching
a name?
If you have ISE installed on your server I recommend you put your code in there and debug it.
If this is helpful please mark it so. Also if this solved your problem mark as answer. -
Filter Criteria in Value Help for InfoObject
Hi,
Is it possible to remove the value from the Filter Criteria? When I use F4 for a field (on the screen), it shows a popup window (Value Help for InfoObject) with some personal values. When I click the "More Values" option, it shows the General Value List, where I can see the Filter Criteria. When I click "Show Filter Criteria", it shows the criteria with a value (this value comes from the screen field I navigated from). Is it possible to avoid passing the value from my screen field to the Filter Criteria?
Thanks
Srinivas

Murugan,
I really hope that the moderator does not remove my points for this!
WebDynpro ABAP Select-Options and OVS Help.
How to Achieve OVS Search Help in Select Option in Web Dynpro ABAP
How to Use OVS Help For Multiple Input Fields in Select-Options.
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/707f0d73-94f0-2d10-97a2-a3776e2118d8?QuickLink=index&…
Using Select Options in a Web Dynpro(ABAP) Application
Regards,
Ashvin -
Clear the Filter Criteria from java code programmatically
Hi All,
I am using jdev version 11.1.1.6.0.
I have an ADF table to which I have added a filter on each column.
I created the table using a Java class data control.
Filter is working Fine .
My use case is:
When I click on the search button, data is populated in the table.
When anybody enters a filter value in a column (say, Product) and hits Enter, it filters the data.
If he clears the filter value but does not hit the Enter key, and searches again, the table does not show all the data; it only shows the filtered data.
So how can I programmatically clear all filters, so that on clicking Search it will show all the values, not the filtered values?
I have not used default Filter Behavior.
Please check below code for reference
<af:table value="#{bindings.AfMyAccOrderStatusHistorySearchVO.rangeSet}"
var="row"
rows="#{bindings.AfMyAccOrderStatusHistorySearchVO.rangeSize}"
emptyText="#{bindings.AfMyAccOrderStatusHistorySearchVO.viewable ? 'No data to display.' : 'Access Denied.'}"
fetchSize="#{bindings.AfMyAccOrderStatusHistorySearchVO.rangeSize}"
rowBandingInterval="0" id="tblStatusHistoryList"
autoHeightRows="#{bindings.AfMyAccOrderStatusHistorySearchVO.rangeSize}"
rowSelection="single"
width="100%"
partialTriggers="::cb5 ::cb8 ::cb1 ::cb2"
filterModel="#{bindings.AfMyAccOrderStatusHistorySearchVO1Query.queryDescriptor}"
queryListener="#{bindings.AfMyAccOrderStatusHistorySearchVO1Query.processQuery}"
filterVisible="true" varStatus="vs"
binding="#{AfMyAccOrderStatusHistoryAction.orderStatusHistorySearchList}">
<af:column headerText="#{alfaprojectBundle['ordstatushistory.column.invoiceDate']}"
width="70"
sortProperty="invoiceDate"
sortable="true" filterable="true"
id="c7" filterFeatures="caseInsensitive">
<af:outputText value="#{row.invoiceDate}" id="ot16"/>
</af:column>
<af:column headerText="#{alfaprojectBundle['ordstatushistory.column.soldto']}"
width="100"
sortProperty="soldTo"
sortable="true" filterable="true"
id="c14" filterFeatures="caseInsensitive">
<af:outputText value="#{row.soldTo}"
visible="#{row.visibilityIsOrdrFirstItem}"
id="ot23"/>
</af:column>
So how do I clear all filter values from Java code?

I can't get the example "Programmatically Manipulating a Table's QBE Filter Fields".
Where is it?
https://smuenchadf.samplecode.oracle.com/samples/ClearTableColumnFilterFields.zip
Thks -
How can I get (using API) the current sort column for some report
hello,
How can I get (using the API) the current sort column for some report? For example, something like "fsp_sort_1_desc" if the user sorts by the first column.
I cannot use :REQUEST for this; sometimes the current sort column is not in :REQUEST, but it is still active.
I thought it was posssible by using
APEX_UTIL.GET_PREFERENCE (
p_preference IN VARCHAR2 DEFAULT NULL,
p_user IN VARCHAR2 DEFAULT V('USER'))
RETURN VARCHAR2;
function, but I don't really know which preference I should pass as the parameter.
Looking in WWV_FLOW_PREFERENCES$, I saw preference names like FSP4000_P527_R6281510839654570_SORT; I'm not sure how this name is formed.
I'm using generic columns for that complex report (which has a flexible number of columns shown), and the idea is that sometimes I have to overwrite that sort column, in case the user chose a version of the report with fewer columns than the previous one.
Can I get (using API) a list of all preferences set for some user ?
Thank you.
It seems that it is FSP<app_number>_P<page_number>_R<region_number>_SORT.
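Based on that naming pattern, the preference name can be assembled from the application ID, page number, and region ID. A small sketch (the IDs are taken from the example name seen in WWV_FLOW_PREFERENCES$; verify the pattern against your own instance):

```python
def sort_preference_name(app_id, page_id, region_id):
    """Build the report sort preference name seen in WWV_FLOW_PREFERENCES$,
    e.g. FSP4000_P527_R6281510839654570_SORT."""
    return f"FSP{app_id}_P{page_id}_R{region_id}_SORT"

print(sort_preference_name(4000, 527, 6281510839654570))
# FSP4000_P527_R6281510839654570_SORT
```

In PL/SQL, the resulting string would then be passed as p_preference to APEX_UTIL.GET_PREFERENCE.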
Is there anywhere I can find these kinds of things documented?
Thank you. -
Default selection on current month, week and date
Hi,
We are on Dashboards 4.1 SP3, the same version as BI.
The dashboard report uses a Live Office connection. We are facing an issue with the default selection on the current month, week and date.
The dashboard report drills down from month to week, then to date. The Live Office report that feeds the dashboard is in month/week/date ascending order; because we have a running-average calculation on the LO report, it seems it has to be in ascending order to get the correct running average.
I tried changing the default selection on the Insertion tab; it works for month, but it does not work for week and day.
With the LO report in ascending order, the default selection on the dashboard column chart is not on the current month, week and date.
Is there a way to solve this issue? Could anyone please help.
Thanks,
Hi Suman,
Thanks for the quick reply.
Do you mean I should enable sorting by category labels on the Behaviour > Common tab?
Thanks, -
Interactive report and default-filter
ahoj!
I have a question regarding interactive reports and the default filter for a date: is it possible to use the current date for the default filter? APEX needs 'dd.MM.yyyy' for the filter... I already tried to_date(sysdate, 'dd.MM.yyyy').
thx in advance,
christian
Hi,
I've just been trying the "in the last" option and had no problems for any number that I entered. Are you just entering 1 into the box? What error do you get?
I've loaded the page with Debug switched on, and get:
select
  null as apxws_row_pk,
  "DATE_ID",
  "ATD_DATE",
  "CHECK",
  count(*) over () as apxws_row_cnt
from (
  select * from (
    select
      apex_item.checkbox(1, DATE_ID) "CHECK",
      "DATE_ID",
      "ATD_DATE"
    from "#OWNER#"."ATD_DATES"
  ) r
  where ("ATD_DATE" between systimestamp - (1 * :APXWS_EXPR_1) and systimestamp)
) r where rownum <= to_number(:APXWS_MAX_ROW_CNT)
:APXWS_EXPR_1 would contain the value 1, as that is what I've entered for the filter. My report's SQL statement is just the innermost nested select statement; the rest has been added by the IR functionality and the filter.
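The predicate in that debug output ("ATD_DATE" between systimestamp - (1 * :APXWS_EXPR_1) and systimestamp) is simply a sliding window ending at the current moment. A quick model of the same logic:

```python
from datetime import datetime, timedelta

def in_the_last(value, days_back):
    """Mimic the IR 'in the last N days' predicate from the debug SQL:
    value between (now - N days) and now."""
    now = datetime.now()
    return now - timedelta(days=days_back) <= value <= now

print(in_the_last(datetime.now() - timedelta(days=1), 2))  # True: within the last 2 days
print(in_the_last(datetime.now() - timedelta(days=5), 2))  # False: outside the window
```

This is why a relative ("in the last") filter avoids the 'dd.MM.yyyy' formatting problem entirely: no fixed date string is ever stored in the filter.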
Andy -
Retrieving sort and filter criteria for interactive report
I have developed a test management system in APEX. Users log in and see an interactive report with their assignments. Each test they've been assigned occupies one row.
There is an "Execute" link for each row that takes the user to the test execution page, where they can see the input steps / expected results and report the test passing or failing.
I've implemented "Next >" and "< Previous" buttons on the execution page, so the user can immediately go forwards or backwards in his assigned tests without having to return to the assignments page and click the Execute link on the next or previous row.
The test execution page figures out what the next and previous assigned test is using the LAG and LEAD functions, like this:
lag(assignment_id) over (order by ref_num_full) prev_id,
lead(assignment_id) over (order by ref_num_full) next_id
Notice that I've "hardcoded" the over clause to be the reference number of the test.
The problem here - this solution ignores any custom sorts or filters the user has put in place on the assignments (interactive) report.
Is there a way I can "grab" the interactive report (1) sort criteria and (2) filter criteria dynamically? I'm thinking I could then use dynamic SQL to build an OVER clause used in the lag/lead calls so that the application now follows whatever sort and/or filter criteria the user has put in place.
Any help is greatly appreciated... thanks!!
Thanks WTine!
I took a look and determined
- I can get the sort criteria from the APEX_APPLICATION_PAGE_IR_RPT view (SORT_COLUMN_# and SORT_DIRECTION_# columns)
- I can get the filter criteria from the APEX_APPLICATION_PAGE_IR_COND view
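Combining those two findings, the sort metadata can drive a dynamically built OVER clause, and the same ordering drives the prev/next lookup. A sketch of the idea (column and key names are hypothetical; in APEX the pairs would come from the SORT_COLUMN_#/SORT_DIRECTION_# columns of APEX_APPLICATION_PAGE_IR_RPT):

```python
def build_over_clause(sort_cols):
    """sort_cols: list of (column, direction) pairs read from the IR metadata."""
    order_by = ", ".join(f"{col} {direction}" for col, direction in sort_cols)
    return f"over (order by {order_by})"

def prev_next_ids(rows, key):
    """Python equivalent of lag()/lead() over the user's current sort order."""
    ordered = sorted(rows, key=key)
    ids = [r["assignment_id"] for r in ordered]
    return {aid: (ids[i - 1] if i > 0 else None,
                  ids[i + 1] if i < len(ids) - 1 else None)
            for i, aid in enumerate(ids)}

print(build_over_clause([("REF_NUM_FULL", "asc")]))
# over (order by REF_NUM_FULL asc)
rows = [{"assignment_id": 10, "ref": "1.2"},
        {"assignment_id": 11, "ref": "1.1"},
        {"assignment_id": 12, "ref": "1.3"}]
print(prev_next_ids(rows, key=lambda r: r["ref"])[10])  # (11, 12)
```

In the real page the generated OVER clause would be spliced into the lag/lead calls via dynamic SQL, so Next/Previous follow whatever sort the user applied.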
Regards, Rich -
Sum the values of a lookup table based on two filter criteria
Hello Everyone
I am new to Power Pivot and would appreciate it if someone could help me with the following problem.
You can download the example Excel file from the following Dropbox link:
Dropbox Link
The first table is tOrders:

Week number | Work center | Order number | Production time in minutes
2           | a           | 111          | 60
2           | a           | 112          | 70
2           | b           | 113          | 60
3           | b           | 114          | 50
3           | a           | 115          | 40
3           | b           | 116          | 60
4           | a           | 117          | 90
4           | b           | 118          | 40
The second is dLookupList:

Week number | Work center | Maintenance in minutes per week | Breakdowns in minutes per week
2           | a           | 10                              | 10
2           | b           | 20                              | 5
3           | a           | 15                              | 12
3           | b           | 30                              | 10
4           | a           | 20                              | 10
4           | b           | 10                              | 10
I'm trying to create a pivot with a filter on Week number that shows the number of orders, the sum of production time in minutes, and the totals from the lookup table dLookupList that match the work center and the selected week numbers, so that I can calculate the total time for each work center. The filter criteria are Week number and Work center.
For example, if someone selects all week numbers the result should look like this:
Week number: (All)

Work center | Count of order number | Sum of production time in minutes | Maintenance in minutes per week | Breakdowns in minutes per week | Total time
a           | 4                     | 260                               | 45                              | 32                             | 337
b           | 4                     | 210                               | 60                              | 25                             | 295
Grand Total | 8                     | 470                               |                                 |                                |
Result for week 2:

Week number: 2

Work center | Count of order number | Sum of production time in minutes | Maintenance in minutes per week | Breakdowns in minutes per week | Total time
a           | 2                     | 130                               | 10                              | 10                             | 150
b           | 1                     | 60                                | 20                              | 5                              | 85
Grand Total | 3                     | 190                               |                                 |                                |
How can I relate these two tables to get the above result?
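To make the target concrete, here is a small sketch that computes the expected totals directly from the two sample tables, filtering both on the same week numbers and grouping by work center (pure Python, not Power Pivot; in the model this corresponds to relating the tables on the Week number + Work center combination):

```python
# (week, work center, order number, production minutes) from tOrders
orders = [(2, "a", 111, 60), (2, "a", 112, 70), (2, "b", 113, 60), (3, "b", 114, 50),
          (3, "a", 115, 40), (3, "b", 116, 60), (4, "a", 117, 90), (4, "b", 118, 40)]
# (week, work center, maintenance minutes, breakdown minutes) from dLookupList
lookup = [(2, "a", 10, 10), (2, "b", 20, 5), (3, "a", 15, 12),
          (3, "b", 30, 10), (4, "a", 20, 10), (4, "b", 10, 10)]

def pivot(weeks):
    """Count orders, sum production time, and add the matching lookup minutes."""
    result = {}
    for wc in sorted({w for _, w, _, _ in orders}):
        rows = [o for o in orders if o[1] == wc and o[0] in weeks]
        prod = sum(t for *_, t in rows)
        maint = sum(m for wk, w, m, _ in lookup if w == wc and wk in weeks)
        brk = sum(b for wk, w, _, b in lookup if w == wc and wk in weeks)
        result[wc] = (len(rows), prod, maint, brk, prod + maint + brk)
    return result

print(pivot({2, 3, 4}))  # {'a': (4, 260, 45, 32, 337), 'b': (4, 210, 60, 25, 295)}
print(pivot({2}))        # {'a': (2, 130, 10, 10, 150), 'b': (1, 60, 20, 5, 85)}
```

The output matches the two expected-result tables above, which is what any table relationship or measure must reproduce.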
Any help is highly appreciated.
Regards
Priyan
Hi Recio,
Thank you very much for the swift response. I was able to get it to work.
I have two questions:
How do you add a Total time column to the pivot table like you did? There are no calculated fields in Power Pivot.
I would prefer the filter to be based on the Orders table, so that if you select all week numbers in the filter, the pivot shows results for all orders and the relevant sums from the lookup list.
Link to download the example file
For example: I add Week number 5 to the work center “a”
Week number | Work center | WNandWC | Maintenance in minutes per week | Breakdowns in minutes per week
2           | a           | WN2WCa  | 10                              | 10
2           | b           | WN2WCb  | 20                              | 5
3           | a           | WN3WCa  | 15                              | 12
3           | b           | WN3WCb  | 30                              | 10
4           | a           | WN4WCa  | 20                              | 10
4           | b           | WN4WCb  | 10                              | 10
5           | a           | WN5WCa  | 1                               | 1
In the orders table there are no records for week number 5
Week number | Work center | WNandWC | Order number | Production time in minutes
2           | a           | WN2WCa  | 111          | 60
2           | a           | WN2WCa  | 112          | 70
2           | b           | WN2WCb  | 113          | 60
3           | b           | WN3WCb  | 114          | 50
3           | a           | WN3WCa  | 115          | 40
3           | b           | WN3WCb  | 116          | 60
4           | a           | WN4WCa  | 117          | 90
4           | b           | WN4WCb  | 118          | 40
4           | a           | WN4WCa  | 119          | 50
But the pivot sums up week number 5 as well.
Do you have any idea how to solve it?
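One way to model the fix is to restrict the lookup sums to weeks that actually appear in the orders table, so the lone week-5 lookup row is never counted. A sketch with the extended sample data (in the Power Pivot model this would mean filtering the lookup measure by the weeks present in tOrders; the exact measure definition is left as an assumption):

```python
# (week, work center, maintenance minutes, breakdown minutes), now including week 5
lookup = [(2, "a", 10, 10), (2, "b", 20, 5), (3, "a", 15, 12),
          (3, "b", 30, 10), (4, "a", 20, 10), (4, "b", 10, 10),
          (5, "a", 1, 1)]          # week 5 exists only in the lookup table
orders_weeks = {2, 3, 4}           # distinct week numbers present in tOrders

# Keep only lookup rows whose week actually occurs in the orders table.
effective = [row for row in lookup if row[0] in orders_weeks]

maintenance_a = sum(m for wk, wc, m, b in effective if wc == "a")
print(maintenance_a)  # 45 -- the week-5 row is excluded
```

With this orders-driven week set, selecting "all weeks" in the filter can only ever pick up lookup rows that have at least one matching order.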
Thank you very much.
Regards
Priyan -
Interactive Reports - Filter Criteria Using Other Operators
Currently using version 3.1.2.
Is there any way to specify the operators that are used for filter criteria on an interactive report?
Trying to specify the following criteria...
LASTNAME like 'B%'
OR
LASTNAME like 'A%'
The interactive report uses an "AND" in this case instead of an "OR".
Any ideas?
Thanks,
Kris
Any comments would be very much appreciated.
Many thanks
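For reference, the difference between what the IR builds and what is wanted can be modeled as predicate composition: two single-column filters are combined with AND, while the requirement is any-match across the two LIKE patterns. A sketch with a minimal LIKE stand-in (only a trailing % wildcard is handled):

```python
def like(value, pattern):
    """Minimal SQL LIKE: supports only a trailing % wildcard, enough for 'B%'."""
    return value.startswith(pattern[:-1]) if pattern.endswith("%") else value == pattern

names = ["Adams", "Baker", "Clark"]
wanted = [n for n in names if like(n, "A%") or like(n, "B%")]    # OR: the desired result
ir_does = [n for n in names if like(n, "A%") and like(n, "B%")]  # AND: what the IR applies

print(wanted)   # ['Adams', 'Baker']
print(ir_does)  # []
```

Since no name can start with both 'A' and 'B', the AND combination always returns nothing, which is why the two filters cannot be stacked to get this result.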