IncidentStatusDurationFactvw view missing data
I wanted to create an accurate measure of time spent on incidents, but to do that I have to subtract the hours an incident spends in status "Pending Supplier" or "Pending User".
I found the view IncidentStatusDurationFactvw which is great for this purpose, however I noticed a lot of missing data...
FinishDateTime is NULL on many statuses, even though the incident has changed status long ago.
(All DW jobs are working fine.)
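Assuming the view exposes one row per status interval (with a NULL FinishDateTime for the current, still-open status), the subtraction described above can be sketched in Python. All names here (net_incident_hours, the tuple layout) are illustrative, not the actual view columns:

```python
from datetime import datetime

# Statuses whose time should not count toward time worked.
PENDING = {"Pending Supplier", "Pending User"}

def net_incident_hours(rows, now=None):
    """Sum hours across status intervals, excluding pending statuses.

    Each row is (status, start, finish); finish may be None for the
    current status, in which case `now` is used as the end point.
    """
    total = 0.0
    for status, start, finish in rows:
        if status in PENDING:
            continue
        end = finish or now or datetime.now()
        total += (end - start).total_seconds() / 3600.0
    return total

# Hypothetical rows mimicking IncidentStatusDurationFactvw intervals.
rows = [
    ("Active",       datetime(2015, 3, 1, 9),  datetime(2015, 3, 1, 12)),
    ("Pending User", datetime(2015, 3, 1, 12), datetime(2015, 3, 2, 9)),
    ("Active",       datetime(2015, 3, 2, 9),  datetime(2015, 3, 2, 11)),
]
print(net_incident_hours(rows))  # 3 + 2 = 5.0 active hours
```

Of course this only works if FinishDateTime is actually populated, which is exactly the missing-data problem reported above.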
Hi Andreas,
I didn't apply the note you mentioned because it was already done on my system.
But I've found another one that was related to "your" note, and it fixed my issue.
The note is 1115435.
Thanks for your help.
Best regards,
Pierre
Similar Messages
-
ALV Excel view missing data on last row
Hi all,
I'm having a bug using ALV report.
I've created a report using "REUSE_ALV_GRID_DISPLAY" FM with the standard user command "USER_COMMAND".
In this menu, there is the button "Microsoft Excel" (Ctrl+Shift+F7) that allows switching the ALV to an Excel view.
My report has 21 columns.
If I'm displaying 38 rows, the Excel view is fine.
But if I'm displaying 39 rows, the last 5 cells are empty...
Does somebody have any idea what the reason could be?
Thanks for your help.
Pierre
-
SMS Transferring/downloading/backup: Is there a way to download/port the missing data/SMS to one iPhone without overwriting the new SMS on the newer iPhone? Here is my dilemma. My iPhone quit working and I couldn't get it fixed as quickly as I needed to. I bought a used iPhone, started using it normally, and downloaded the backup from iTunes/iCloud. There was a large gap in data where the first iPhone didn't back up when I thought it had. I have SMS data on both phones now and I need them for a court hearing. I have since had the other iPhone repaired and it is working perfectly. Will downloading a backup wipe out the newer SMS data? I am nervous because I can't lose the information or my case may be compromised. Any "real help", including programs that work to accomplish this, would be very helpful. Thank you in advance…
No, direct access to manipulating the phone's filesystem is not permitted for third-party apps in iOS. It can't be done without hacking the phone, which you certainly don't want to do. The messages can be combined and viewed on your computer, but not on a single phone.
-
[SOLVED] Can't view Monitorix data
I installed Monitorix on my home server and want it to work together with an Apache HTTP server. I think I configured it correctly, and I can reach Monitorix's home page, but when I click on 'OK' to view the data, what I get instead is what I assume to be a Perl script:
#!/usr/bin/env perl
# Monitorix - A lightweight system monitoring tool.
# Copyright (C) 2005-2015 by Jordi Sanfeliu <[email protected]>
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
use strict;
use warnings;
use FindBin qw($Bin);
use lib $Bin . "/lib", "/usr/lib/monitorix";
use Monitorix;
use CGI qw(:standard);
use Config::General;
use POSIX;
use RRDs;
my %config;
my %cgi;
my %colors;
my %tf;
my @version12;
my @version12_small;
sub multihost {
my ($config, $colors, $cgi) = @_;
my $n;
my $n2;
my @host;
my @url;
my @foot_url;
my $multihost = $config->{multihost};
if($cgi->{val} =~ m/group(\d*)/) {
my @remotegroup_desc;
# all groups
if($cgi->{val} eq "group") {
my @remotegroup_list = split(',', $multihost->{remotegroup_list});
for($n = 0; $n < scalar(@remotegroup_list); $n++) {
scalar(my @tmp = split(',', $multihost->{remotegroup_desc}->{$n}));
for($n2 = 0; $n2 < scalar(@tmp); $n2++) {
push(@remotegroup_desc, trim($tmp[$n2]));
# specific group
if($cgi->{val} =~ m/group(\d+)/) {
my $gnum = int($1);
@remotegroup_desc = split(',', $multihost->{remotegroup_desc}->{$gnum});
my @remotehost_list = split(',', $multihost->{remotehost_list});
for($n = 0; $n < scalar(@remotegroup_desc); $n++) {
my $h = trim($remotegroup_desc[$n]);
for($n2 = 0; $n2 < scalar(@remotehost_list); $n2++) {
my $h2 = trim($remotehost_list[$n2]);
if($h eq $h2) {
push(@host, $h);
push(@url, (split(',', $multihost->{remotehost_desc}->{$n2}))[0] . (split(',', $multihost->{remotehost_desc}->{$n2}))[2]);
push(@foot_url, (split(',', $multihost->{remotehost_desc}->{$n2}))[0] . (split(',', $multihost->{remotehost_desc}->{$n2}))[1]);
} else {
my @remotehost_list = split(',', $multihost->{remotehost_list});
for($n = 0; $n < scalar(@remotehost_list); $n++) {
push(@host, trim($remotehost_list[$n]));
push(@url, (split(',', $multihost->{remotehost_desc}->{$n}))[0] . (split(',', $multihost->{remotehost_desc}->{$n}))[2]);
push(@foot_url, (split(',', $multihost->{remotehost_desc}->{$n}))[0] . (split(',', $multihost->{remotehost_desc}->{$n}))[1]);
$multihost->{graphs_per_row} = 1 unless $multihost->{graphs_per_row} > 1;
my $graph = ($cgi->{graph} eq "all" || $cgi->{graph} =~ m/group\[0-9]*/) ? "_system1" : $cgi->{graph};
if($cgi->{val} eq "all" || $cgi->{val} =~ m/group[0-9]*/) {
for($n = 0; $n < scalar(@host); $n += $multihost->{graphs_per_row}) {
print "<table cellspacing='5' cellpadding='0' width='1' bgcolor='$colors->{graph_bg_color}' border='1'>\n";
print " <tr>\n";
for($n2 = 0; $n2 < $multihost->{graphs_per_row}; $n2++) {
if($n < scalar(@host)) {
print " <td bgcolor='$colors->{title_bg_color}'>\n";
print " <font face='Verdana, sans-serif' color='$colors->{fg_color}'>\n";
print " <b> " . $host[$n] . "</b>\n";
print " </font>\n";
print " </td>\n";
$n++;
print " </tr>\n";
print " <tr>\n";
for($n2 = 0, $n = $n - $multihost->{graphs_per_row}; $n2 < $multihost->{graphs_per_row}; $n2++) {
if($n < scalar(@host)) {
print " <td bgcolor='$colors->{title_bg_color}' style='vertical-align: top; height: 10%; width: 10%;'>\n";
print " <iframe src='" . $url[$n] . "/monitorix.cgi?mode=localhost&when=$cgi->{when}&graph=$graph&color=$cgi->{color}&silent=imagetag' height=201 width=397 frameborder=0 marginwidth=0 marginheight=0 scrolling=no></iframe>\n";
print " </td>\n";
$n++;
print " </tr>\n";
print " <tr>\n";
for($n2 = 0, $n = $n - $multihost->{graphs_per_row}; $n2 < $multihost->{graphs_per_row}; $n2++) {
if($n < scalar(@host)) {
if(lc($multihost->{footer_url}) eq "y") {
print " <td bgcolor='$colors->{title_bg_color}'>\n";
print " <font face='Verdana, sans-serif' color='$colors->{title_fg_color}'>\n";
print " <font size='-1'>\n";
print " <b> <a href='" . $foot_url[$n] . "' style='color: " . $colors->{title_fg_color} . ";'>$foot_url[$n]</a></b>\n";
print " </font></font>\n";
print " </td>\n";
$n++;
$n = $n - $multihost->{graphs_per_row};
print " </tr>\n";
print "</table>\n";
print "<br>\n";
} else {
print " <table cellspacing='5' cellpadding='0' width='1' bgcolor='$colors->{graph_bg_color}' border='1'>\n";
print " <tr>\n";
print " <td bgcolor='$colors->{title_bg_color}'>\n";
print " <font face='Verdana, sans-serif' color='$colors->{fg_color}'>\n";
print " <b> " . $host[$cgi->{val}] . "</b>\n";
print " </font>\n";
print " </td>\n";
print " </tr>\n";
print " <tr>\n";
print " <td bgcolor='$colors->{title_bg_color}' style='vertical-align: top; height: 10%; width: 10%;'>\n";
print " <iframe src='" . (split(',', $multihost->{remotehost_desc}->{$cgi->{val}}))[0] . (split(',', $multihost->{remotehost_desc}->{$cgi->{val}}))[2] . "/monitorix.cgi?mode=localhost&when=$cgi->{when}&graph=$graph&color=$cgi->{color}&silent=imagetagbig' height=249 width=545 frameborder=0 marginwidth=0 marginheight=0 scrolling=no></iframe>\n";
print " </td>\n";
print " </tr>\n";
print " <tr>\n";
if(lc($multihost->{footer_url}) eq "y") {
print " <td bgcolor='$colors->{title_bg_color}'>\n";
print " <font face='Verdana, sans-serif' color='$colors->{title_fg_color}'>\n";
print " <font size='-1'>\n";
print " <b> <a href='" . $foot_url[$cgi->{val}] . "' style='color: " . $colors->{title_fg_color} . ";'>$foot_url[$cgi->{val}]</a></b>\n";
print " </font></font>\n";
print " </td>\n";
print " </tr>\n";
print " </table>\n";
print " <br>\n";
sub graph_header {
my ($title, $colspan) = @_;
print("\n");
print("<!-- graph table begins -->\n");
print(" <table cellspacing='5' cellpadding='0' width='1' bgcolor='$colors{graph_bg_color}' border='1'>\n");
print(" <tr>\n");
print(" <td bgcolor='$colors{title_bg_color}' colspan='$colspan'>\n");
print(" <font face='Verdana, sans-serif' color='$colors{title_fg_color}'>\n");
print(" <b> $title</b>\n");
print(" </font>\n");
print(" </td>\n");
print(" </tr>\n");
sub graph_footer {
print(" </table>\n");
print("<!-- graph table ends -->\n");
# MAIN
open(IN, "< monitorix.conf.path");
my $config_path = <IN>;
chomp($config_path);
close(IN);
if(! -f $config_path) {
print(<< "EOF");
Content-Type: text/plain
<pre>
FATAL: Monitorix is unable to continue!
=======================================
File 'monitorix.conf.path' was not found.
Please make sure that 'base_dir' option is correctly configured and this
CGI (monitorix.cgi) is located in the 'base_dir'/cgi/ directory.
And don't forget to restart Monitorix for the changes to take effect!
EOF
die "FATAL: File 'monitorix.conf.path' was not found!";
# load main configuration file
my $conf = new Config::General(
-ConfigFile => $config_path,
%config = $conf->getall;
# load additional configuration files
if($config{include_dir} && opendir(DIR, $config{include_dir})) {
my @files = grep { !/^[.]/ } readdir(DIR);
closedir(DIR);
foreach my $c (sort @files) {
next unless -f $config{include_dir} . "/$c";
next unless $c =~ m/\.conf$/;
my $conf_inc = new Config::General(
-ConfigFile => $config{include_dir} . "/$c",
my %config_inc = $conf_inc->getall;
while(my ($key, $val) = each(%config_inc)) {
if(ref($val) eq "HASH") {
# two level options
while(my ($key2, $val2) = each(%{$val})) {
if(ref($val2) eq "HASH") {
# three level options
while(my ($key3, $val3) = each(%{$val2})) {
$config{$key}->{$key2}->{$key3} = $val3;
delete $config_inc{$key}->{$key2}->{$key3};
next;
$config{$key}->{$key2} = $val2;
delete $config_inc{$key}->{$key2};
next;
# graph_name option is special
if($key eq "graph_name") {
$config{graph_name} .= ", $val";
delete $config_inc{graph_name};
next;
# one level options
$config{$key} = $val;
delete $config_inc{$key};
$config{url} = ($config{url_prefix_proxy} || "");
if(!$config{url}) {
$config{url} = ($ENV{HTTPS} || ($config{httpd_builtin}->{https_url} || "n") eq "y") ? "https://" . $ENV{HTTP_HOST} : "http://" . $ENV{HTTP_HOST};
$config{hostname} = $config{hostname} || $ENV{SERVER_NAME};
if(!($config{hostname})) { # called from the command line
$config{hostname} = "127.0.0.1";
$config{url} = "http://127.0.0.1";
$config{url} .= $config{base_url};
our $mode = defined(param('mode')) ? param('mode') : '';
our $graph = param('graph');
our $when = param('when');
our $color = param('color');
our $val = defined(param('val')) ? param('val') : '';
our $silent = defined(param('silent')) ? param('silent') : '';
if($mode ne "localhost") {
($mode, $val) = split(/\./, $mode);
if(lc($config{httpd_builtin}->{enabled}) ne "y") {
print("Content-Type: text/html\n");
print("\n");
# get the current OS and kernel version
my $release;
($config{os}, undef, $release) = uname();
if(!($release =~ m/^(\d+)\.(\d+)/)) {
die "FATAL: unable to get the kernel version.";
$config{kernel} = "$1.$2";
$colors{graph_colors} = ();
$colors{warning_color} = "--color=CANVAS#880000";
# keep backwards compatibility for v3.2.1 and less
if(ref($config{theme}) ne "HASH") {
delete($config{theme});
if(!$config{theme}->{$color}) {
$color = "white";
$config{theme}->{$color}->{main_bg} = "FFFFFF";
$config{theme}->{$color}->{main_fg} = "000000";
$config{theme}->{$color}->{title_bg} = "777777";
$config{theme}->{$color}->{title_fg} = "CCCC00";
$config{theme}->{$color}->{graph_bg} = "CCCCCC";
$config{theme}->{$color}->{gap} = "000000";
if($color eq "black") {
push(@{$colors{graph_colors}}, "--color=CANVAS#" . $config{theme}->{$color}->{canvas});
push(@{$colors{graph_colors}}, "--color=BACK#" . $config{theme}->{$color}->{back});
push(@{$colors{graph_colors}}, "--color=FONT#" . $config{theme}->{$color}->{font});
push(@{$colors{graph_colors}}, "--color=MGRID#" . $config{theme}->{$color}->{mgrid});
push(@{$colors{graph_colors}}, "--color=GRID#" . $config{theme}->{$color}->{grid});
push(@{$colors{graph_colors}}, "--color=FRAME#" . $config{theme}->{$color}->{frame});
push(@{$colors{graph_colors}}, "--color=ARROW#" . $config{theme}->{$color}->{arrow});
push(@{$colors{graph_colors}}, "--color=SHADEA#" . $config{theme}->{$color}->{shadea});
push(@{$colors{graph_colors}}, "--color=SHADEB#" . $config{theme}->{$color}->{shadeb});
push(@{$colors{graph_colors}}, "--color=AXIS#" . $config{theme}->{$color}->{axis})
if defined($config{theme}->{$color}->{axis});
$colors{bg_color} = $config{theme}->{$color}->{main_bg};
$colors{fg_color} = $config{theme}->{$color}->{main_fg};
$colors{title_bg_color} = $config{theme}->{$color}->{title_bg};
$colors{title_fg_color} = $config{theme}->{$color}->{title_fg};
$colors{graph_bg_color} = $config{theme}->{$color}->{graph_bg};
$colors{gap} = $config{theme}->{$color}->{gap};
($tf{twhen}) = ($when =~ m/(hour|day|week|month|year)$/);
($tf{nwhen} = $when) =~ s/$tf{twhen}// unless !$tf{twhen};
$tf{nwhen} = 1 unless $tf{nwhen};
$tf{twhen} = "day" unless $tf{twhen};
$tf{when} = $tf{nwhen} . $tf{twhen};
# toggle this to 1 if you want to maintain old (2.3-) Monitorix with Multihost
if($config{backwards_compat_old_multihost}) {
$tf{when} = $tf{twhen};
our ($res, $tc, $tb, $ts);
if($tf{twhen} eq "day") {
($tf{res}, $tf{tc}, $tf{tb}, $tf{ts}) = (3600, 'h', 24, 1);
if($tf{twhen} eq "week") {
($tf{res}, $tf{tc}, $tf{tb}, $tf{ts}) = (108000, 'd', 7, 1);
if($tf{twhen} eq "month") {
($tf{res}, $tf{tc}, $tf{tb}, $tf{ts}) = (216000, 'd', 30, 1);
if($tf{twhen} eq "year") {
($tf{res}, $tf{tc}, $tf{tb}, $tf{ts}) = (5184000, 'd', 365, 1);
if($RRDs::VERSION > 1.2) {
push(@version12, "--slope-mode");
push(@version12, "--font=LEGEND:7:");
push(@version12, "--font=TITLE:9:");
push(@version12, "--font=UNIT:8:");
if($RRDs::VERSION >= 1.3) {
push(@version12, "--font=DEFAULT:0:Mono");
if($tf{twhen} eq "day") {
push(@version12, "--x-grid=HOUR:1:HOUR:6:HOUR:6:0:%R");
push(@version12_small, "--font=TITLE:8:");
push(@version12_small, "--font=UNIT:7:");
if($RRDs::VERSION >= 1.3) {
push(@version12_small, "--font=DEFAULT:0:Mono");
if(!$silent) {
my $title;
my $str;
my $piwik_code = "";
my ($piwik_url, $piwik_sid, $piwik_img);
# Piwik tracking code
if(lc($config{piwik_tracking}->{enabled}) eq "y") {
$piwik_url = $config{piwik_tracking}->{url} || "";
$piwik_sid = $config{piwik_tracking}->{sid} || "";
$piwik_img = $config{piwik_tracking}->{img} || "";
$piwik_code = <<"EOF";
<!-- Piwik -->
<script type="text/javascript">
var _paq = _paq || [];
_paq.push(['trackPageView']);
_paq.push(['enableLinkTracking']);
(function() {
var u=(("https:" == document.location.protocol) ? "https" : "http") + "$piwik_url";
_paq.push(['setTrackerUrl', u+'piwik.php']);
_paq.push(['setSiteId', $piwik_sid]);
var d=document, g=d.createElement('script'), s=d.getElementsByTagName('script')[0]; g.type='text/javascript';
g.defer=true; g.async=true; g.src=u+'piwik.js';
s.parentNode.insertBefore(g,s);
</script>
<noscript>
<p><img src="$piwik_img" style="border:0;" alt=""/></p>
</noscript>
<!-- End Piwik Code -->
EOF
print("<!DOCTYPE HTML PUBLIC '-//W3C//DTD HTML 3.2 Final//EN'>\n");
print("<html>\n");
print(" <head>\n");
print(" <title>$config{title}</title>\n");
print(" <link rel='shortcut icon' href='" . $config{url} . "/" . $config{favicon} . "'>\n");
if($config{refresh_rate}) {
print(" <meta http-equiv='Refresh' content='" . $config{refresh_rate} . "'>\n");
print(" </head>\n");
print(" <body bgcolor='" . $colors{bg_color} . "' vlink='#888888' link='#888888'>\n");
print(" $piwik_code\n");
print(" <center>\n");
print(" <table cellspacing='5' cellpadding='0' bgcolor='" . $colors{graph_bg_color} . "' border='1'>\n");
print(" <tr>\n");
if(($val ne "all" || $val ne "group") && $mode ne "multihost") {
print(" <td bgcolor='" . $colors{title_bg_color} . "'>\n");
print(" <font face='Verdana, sans-serif' color='" . $colors{title_fg_color} . "'>\n");
print(" <font size='5'><b> Host: </b></font>\n");
print(" </font>\n");
print(" </td>\n");
if($val =~ m/group(\d+)/) {
my $gnum = $1;
my $gname = (split(',', $config{multihost}->{remotegroup_list}))[$gnum];
$gname = trim($gname);
print(" <td bgcolor='" . $colors{title_bg_color} . "'>\n");
print(" <font face='Verdana, sans-serif' color='" . $colors{title_fg_color} . "'>\n");
print(" <font size='5'><b> $gname </b></font>\n");
print(" </font>\n");
print(" </td>\n");
print(" <td bgcolor='" . $colors{bg_color} . "'>\n");
print(" <font face='Verdana, sans-serif' color='" . $colors{fg_color} . "'>\n");
if($mode eq "localhost" || $mode eq "traffacct") {
$title = $config{hostname};
} elsif($mode eq "multihost") {
$graph = $graph eq "all" ? "_system1" : $graph;
my ($g1, $g2) = ($graph =~ /(_\D+).*?(\d)$/);
if($g1 eq "_port") {
$title = $config{graphs}->{$g1};
$g2 = trim((split(',', $config{port}->{list}))[$g2]);
$title .= " " . $g2;
$g2 = (split(',', $config{port}->{desc}->{$g2}))[0];
$title .= " (" . trim($g2) . ")";
} else {
$g2 = "" if $g1 eq "_proc"; # '_procn' must be converted to '_proc'
$title = $config{graphs}->{$g1 . $g2};
$title =~ s/ / /g;
print(" <font size='5'><b> $title </b></font>\n");
print(" </font>\n");
print(" </td>\n");
print(" <td bgcolor='" . $colors{title_bg_color} . "'>\n");
print(" <font face='Verdana, sans-serif' color='" . $colors{title_fg_color} . "'>\n");
print(" <font size='5'><b> last $tf{twhen} </b></font>\n");
print(" </font>\n");
print(" </td>\n");
print(" </tr>\n");
print(" </table>\n");
print(" <font face='Verdana, sans-serif' color='" . $colors{fg_color} . "'>\n");
print(" <h4><font color='#888888'>" . strftime("%a %b %e %H:%M:%S %Z %Y", localtime) . "</font></h4>\n");
$cgi{colors} = \%colors;
$cgi{tf} = \%tf;
$cgi{version12} = \@version12;
$cgi{version12_small} = \@version12_small;
$cgi{graph} = $graph;
$cgi{when} = $when;
$cgi{color} = $color;
$cgi{val} = $val;
$cgi{silent} = $silent;
if($mode eq "localhost") {
foreach (split(',', $config{graph_name})) {
my $gn = trim($_);
my $g = "";
if($graph ne "all") {
($g) = ($graph =~ m/^_*($gn)\d*$/);
next unless $g;
if(lc($config{graph_enable}->{$gn}) eq "y") {
my $cgi = $gn . "_cgi";
eval "use $gn qw(" . $cgi . ")";
if($@) {
print(STDERR "WARNING: unable to load module '$gn. $@'\n");
next;
if($graph eq "all" || $gn eq $g) {
no strict "refs";
&$cgi($gn, \%config, \%cgi);
} elsif($mode eq "multihost") {
multihost(\%config, \%colors, \%cgi);
} elsif($mode eq "traffacct") {
eval "use $mode qw(traffacct_cgi)";
if($@) {
print(STDERR "WARNING: unable to load module '$mode'. $@\n");
exit;
traffacct_cgi($mode, \%config, \%cgi);
if(!$silent) {
print("\n");
print(" </font>\n");
print(" </center>\n");
print("<!-- footer begins -->\n");
print(" <p>\n");
print(" <a href='http://www.monitorix.org'><img src='" . $config{url} . "/" . $config{logo_bottom} . "' border='0'></a>\n");
print(" <br>\n");
print(" <font face='Verdana, sans-serif' color='" . $colors{fg_color} . "' size='-2'>\n");
print("Copyright © 2005-2015 Jordi Sanfeliu\n");
print(" </font>\n");
print(" </body>\n");
print("</html>\n");
print("<!-- footer ends -->\n");
0;
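When Apache serves the CGI source as plain text like this, it usually means CGI execution is not enabled for Monitorix's cgi directory. A minimal sketch of the directives that would make Apache execute the script (the module path and base_dir are assumptions; adjust to your install):

```apache
# Load mod_cgid (use mod_cgi.so on the prefork MPM) so .cgi files are executed
LoadModule cgid_module modules/mod_cgid.so

# Assumed base_dir; many packages install Monitorix under /usr/share/monitorix
Alias /monitorix /usr/share/monitorix
<Directory /usr/share/monitorix/cgi>
    Options +ExecCGI
    AddHandler cgi-script .cgi
    Require all granted
</Directory>
```

After changing this, restart Apache and request /monitorix/cgi/monitorix.cgi again; the script should now run instead of being displayed.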
Last edited by joaocandre (2015-03-15 20:45:19)
karol wrote: Have you read https://wiki.archlinux.org/index.php/Mo … torix_Data ?
I did; is there anything I'm missing? I'm sure base_dir is correctly set up in the config file. Other than that, the wiki article doesn't say whether there are any additional steps to set it up on an Apache server. -
Hyper-V Host Server 2012 R2 x64 drive c:
All VMs, including SBS 2011 Standard x64 are on drive d:
I replaced my motherboard and defined a Marvell RAID 1 Mirror for my boot drive C:. My drive D: containing all of my Hyper V machines remained untouched since the new MB had Intel Matrix Controller with RAID 5, as did the old MB.
Since the C: drives are a few bytes smaller after being configured as a RAID 1, I was unable to restore drive C: from the image copy, so I rebuilt a new Hyper-V host, as identical as possible, using Server 2012 R2 x64. I had to redefine the Hyper-V machines on the host, and the first time I brought it all up (host and VMs) I had to reconfigure the NICs due to new MAC addresses and verify SBS 2011 DNS and DHCP for all physical and Hyper-V machines. Otherwise my Hyper-V SBS 2011 x64 wasn't touched.
Now the SBS Exchange 2010 is not receiving mail. In the Exchange Management Console, clicking on the Organization configuration gives me a highlighted error: "You don't have sufficient permissions to view this data". The Server Configuration is no longer in the EMC. The Recipient Configuration looks fine.
Outlook 2013 x64 connects to the Exchange server, but no new mail appeared. There were just a few older messages that came in since the last time I opened Outlook. My public DNS on GoDaddy was not changed. I did not change sending mail through my ISP, Sonic.net. The SBS internal DNS doesn't have an MX record, but I'm not sure it ever did, since my GoDaddy public DNS has an MX record.
I tried restoring to an earlier Image Copy, but that made no difference. I'm using the Windows Backup from SBS 2011 for my daily image copy backups.
- Michael Faklis
Hi Michael,
After more research and running the Best Practices report from the EMC, I am missing a slew of ManagementRolesAndRoleGroups from the RBAC container.
Based on your description, it seems that you have found the cause of this issue. In the current situation, I still suggest that you run the SBS 2011 BPA and then check whether it finds any relevant errors, just for confirmation.
Meanwhile, please refer to the following articles and check whether they can help you.
Apply missing Exchange 2010 RBAC Management Roles and Policies
RBAC Manager (Exchange 2010 SP2)
Hope this helps.
Best regards,
Justin Gu -
Performance View with data from Custom Rule
Hello again everybody,
so we created a vbscript which measures time needed for a specific process inside a program.
Now we made a rule for getting this data inside SCOM. Alerting and Health-Monitoring works fine.
For example, we made "if TimeNeeded>5" critical Health state, works like a charm.
But now, we want to view the data (the script runs every 30 seconds) inside a Performance view.
- We checked if the rule works as intended. Check
- Set the rule to "Performance Collection". Check
- Set the right target group. Check
- Override for specific target. Check
- Created a perf.monitor "collected by specific rules" (added our rule) with right targeting. Check
But the performance view stays unpopulated.
What now?
I exported the MP as an XML file, but the rule is not in there, only references.
But that's not the point anyway; the rule is working.
The only thing missing is to grab the data the script returns (via the oAPI property bag) and show it in a view. That's all.
Maybe the question was not clear enough:
Our script returns a value every 30 seconds, for example 6, then 4, then 8, then 10 and so on...
All we want now is to show these values in a Performance-View.
Graphical example:
10| X
08| X
06| X
04| X X
02| X
00|
I guess you now know what I mean.
This Article
corelan.be/index.php/2008/04/16/using-customnon-performance-counter-data-to-build-graphs-in-operations-manager-2007
is EXACTLY what I want and what I did.
But my Performance View refuses to show
ANYTHING...
What's the problem? -
Reg: Consuming missing data by JMS queue.
Hi,
System1 is configured to send XML messages through a JMS queue to my WL server, but sometimes I have observed messages failing to be consumed at my server end.
What may be the reason? How can I consume the missing XML messages? Is there any place where the missing data will be stored?
Please advise...!
Regards,
Sk
When I mentioned that there is an expiry time set with the message, I meant that the message is received by your WLS, but you are not able to see it because the message has expired.
You can only view the messages that are not expired.
If you think you are not even receiving the message, maybe you can enable debugging on the WLS JMS and see if you are able to get more details in the server logs.
Please refer - http://download.oracle.com/docs/cd/E12840_01/wls/docs103/jms_admin/troubleshoot.html#wp1128871
Thanks,
Patrick -
View V_TVCPAFK - Data not displayed when logged in with Finnish (FI) language
Hi All,
I am executing transaction VTAF, which in the background calls view V_TVCPAFK.
When I execute transaction VTAF using login language Finnish, I am unable to view any data, whereas if it is English I am able to see the data that is maintained in the view V_TVCPAFK.
Please suggest a solution.
Regards,
Sachin.
It seems that during the language import not all the required steps were done.
For example, Language Supplementation might be missing.
For a start:
http://help.sap.com/erp2005_ehp_04/helpdata/EN/bc/0ffd2138c64c768cc19fe519884b4c/content.htm
hope this helps -
I'm trying to print a poster I created and posted on my website; it was created in PowerPoint, saved as PDF, and uploaded using WordPress. It looks good online but is missing data when printed. It contains almost all the information except for about 6 lines, which happen to be important. Any thoughts are appreciated.
No, it's weird; everything was the same. I sort of resolved it: since he can view the PDF fine on his computer but also wanted to print it, I just exported all the slides as JPEGs, made a new InDesign book with those on the pages, and exported that as a PDF. Seemed to work for him; I just don't know what happened!
-
DFS behavior on Server 2008 R2 / Windows backups / missing data
Hello,
created a namespace, added a folder, configured full mesh replication between 4 members, everything seemed to work fine (data started replicating to all other members when added to any of them, etc.). Left them for initial replication over our slow link.
Tonight the Windows backup process ran and backed up all the data. The DFS process stopped while the backup was running (that's what Event Viewer shows). After it started again, all the data that wasn't already replicated to the other member servers is missing from
the "primary" one...
Now whenever I restore that missing data to the original folder, DFS starts syncing and removes it again. The only way to restart the whole thing is to disable replication to all folders, wipe them, wait for such changes to apply, etc.
The server i'm adding data to is Server 2008 R2, other DFS members are Server 2012 R2.
Why is this happening and how do I fix it?...
Thanks!
Hi,
From the behavior, the initial replication was interrupted before finishing, and DFSR now considers the not-yet-copied data to be "pre-existing data", so it removed it from the DFSR folder to avoid conflicts.
In the current situation, you should pre-seed the replication data instead of waiting for DFSR to finish it. Enable DFSR again once your pre-seeding job has finished.
To do a pre-seeding, you can use Robocopy to copy files/folders with NTFS permissions to your other 3 servers.
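A hedged sketch of such a pre-seeding copy (the server name MEMBER2 and the folder path are placeholders; substitute your own replicated folder and remaining members):

```text
robocopy D:\DFSRoot\Folder \\MEMBER2\D$\DFSRoot\Folder /E /B /COPYALL /R:3 /W:5 /LOG:preseed.log
```

/E copies subdirectories (including empty ones), /B uses backup mode, and /COPYALL preserves NTFS permissions and other file info, which matters because DFSR hashes include security descriptors.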
Please remember to mark the replies as answers if they help and un-mark them if they provide no help. If you have feedback for TechNet Support, contact [email protected] -
Help needed with missing data problem in CRVS2010
We recently upgraded the reporting engine in our product to use Crystal Reports for Visual Studio 2010 (previously engine was CR9). Our quote report, which has numerous subreports and lots of conditional formatting, started losing data when a quote took more than a single page to be printed. We knew the SQL results included the data, but the report was not printing those lines at all or sometimes printing a partial line. In addition, the running total on the report would exclude the lines that were being missed on the next page. In one example submitted by a customer, 3 lines were skipped between pages.
I think I have identified two potential issues that document the possibility of data not being included in the report.
The first potential issue is an issue with the "suppress blank section" option being checked. This issue is supposedly fixed with ADAPT01483793, being released someday with service pack 2 for CRVS2010.
The second potential issue is using shared variables. This issue is supposedly fixed with ADAPT01484308, also targeted for SP2.
Our quote report does not explicitly use shared variables with any of the subreports, but it does have several subreports, each in its own section that has the "suppress blank section" option checked. We have other reports that use this feature as well, and they are not exhibiting the problem.
One different thing about the quote report is that it has a section with multiple suppression options selected. The section has a conditional suppression formula, which controls whether the section is included at all within the report. The section also has the suppress blank section option selected. There are multiple fields within the report that are each conditionally suppressed. In theory, the section's suppress formula could evaluate to true, yet all of the fields within the section are suppressed (due to null values), and then the "suppress blank section" option would kick in.
The missing data only seems to happen when the section is not being suppressed, and at least one of the fields is being included in the report. If I clear the "suppress blank section" check box, and change the section formula to also include the rules applied to the fields in the section, the missing data problem seems to be resolved.
Is this related to ADAPT01483793? Will it be fixed in service pack 2?
If more details are needed, I would be happy to provide a sample report with stored data.
Hi Don,
Have a look at the record selection formula in CR Designer (stand-alone), and when the report is exported to RPT format, open that report in the Designer also.
There have been a few issues with => logic in the record selection formula. It could be you are running into this problem. Look for a NOT inserted into your selection formula.
Oh, and SP2 is coming out shortly, so it may resolve the issue. But if you want, you could purchase a support case, or if you have a support contract then create a case in SMP and get a rep to work with you to debug the issue.
If you have not, try the trial version of CR 2011; put it on a VMware image or test PC so you don't corrupt anything for production, and have a look at and test it in that designer also. If you purchase a case and it is a bug then you'll get a credit back for the case.
Don
Edited by: Don Williams on Oct 26, 2011 7:40 AM -
Getting an error while viewing data in OWB
Hi,
I created an External table based on a flat file. But I am not able to view the data in the external table while the flat file has the data.
I am using this external table as a source in my mapping, but no data is getting loaded even after the successful execution of the mapping. I am getting the following error while viewing the data:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-29400: data cartridge error
KUP-04040: file ATLAS in ATLAS_LOCATION1 not found
Please help me regarding this. I am new to OWB 11g.
Thanks for your reply. I still can't view the data. Actually, what I did was import a .csv file, then create an external table, configure it, validate it, and deploy it. In the next step I created a mapping with this external table using the table operator, validated it, and generated the code. Then, using the Control Center Manager, I deployed it. But I still can't view the data in the external table.
Is this the correct procedure for loading the sheet into the database? My client environment is on Windows XP and the database is in a Linux environment.
Please help me regarding this. I am new to OWB. -
Can't view my data on my table in EM
Can anyone tell me why I am getting the error below after executing my control file, and why I can't view the data that was supposed to be loaded?
SQL*Loader: Release 10.2.0.1.0 - Production on Thu Jul 12 16:28:57 2007
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: workattribute2.ctl
Data File: workattribute.dat
Bad File: workattribute.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table WORKATTRIBUTE, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
RESUMEID FIRST * CHARACTER
Terminator string : X'09'
WORKID NEXT * CHARACTER
Terminator string : X'09'
ID NEXT * CHARACTER
Terminator string : X'09'
TYPE NEXT * CHARACTER
Terminator string : X'09'
Record 1: Rejected - Error on table WORKATTRIBUTE, column RESUMEID.
Field in data file exceeds maximum length
(Records 2 through 51 were rejected with the same error.)
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table WORKATTRIBUTE:
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 66048 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 51
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Thu Jul 12 16:28:57 2007
Run ended on Thu Jul 12 16:29:05 2007
Elapsed time was: 00:00:08.09
CPU time was: 00:00:00.12
thank you in advance
Hi User572297,
This is the wrong forum for a SQL*Loader question; not a bad thing, you just won't get as many responses.
That said, it seems either your terminator string is missing or incorrect, or you are simply trying to load too many characters into a column that is too small, for example loading 50 characters into a VARCHAR2(40). You get this error for every row, which is why you cannot see any of the data: none of the rows loaded. See below:
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
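As background for the advice above: SQL*Loader treats a delimited field declared simply as CHARACTER as CHAR(255) by default, so "Field in data file exceeds maximum length" can mean either that the target column is too small or that the 255-byte field buffer is too small, often because a wrong terminator makes one "field" swallow the whole line. A hedged control-file sketch (the table and field names come from the log; the explicit lengths are assumptions):

```sql
LOAD DATA
INFILE 'workattribute.dat'
BADFILE 'workattribute.bad'
APPEND
INTO TABLE workattribute
FIELDS TERMINATED BY X'09'   -- tab-separated, as shown in the log
( resumeid  CHAR(4000),      -- raise the 255-byte default field buffer
  workid    CHAR(4000),
  id        CHAR(4000),
  type      CHAR(4000)
)
```

If the rows are still rejected with larger buffers, the terminator is probably wrong for the file; inspect the first lines of workattribute.dat with a hex-capable editor to confirm the fields really are tab-delimited.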
What would help is to know the datatype and size of the RESUMEID column AND to see the first lines of your workattribute.dat file. -
Can't view changed data in journal data
Hi,
I have implemented JKM Oracle 10g Consistent Logminer on Oracle 10g with the following option.
- Asynchronous_mode : yes
- Auto_configuration : yes
1. Change Data Capture -> Add to CDC, 2.Subscriber->subscribe (sunopsis),
3. Start Journal
The journal started correctly without errors. The journalized table always shows the green clock symbol, so everything appears to be working.
Then I inserted one record into the source table, but I can't view the changed data in the journal data. I can't understand why no journal data was generated.
There are no errors.
Please help!
Was your Designer set to the correct context?
Look at the list box at the top right of the Designer interface.
It must be the same context as the one in which you defined your journalizing.
Insert old missing data from one table to another(databaase trigger)
Hello,
I want to ask a few things:
1) I want to insert old missing data from one table into another through a database trigger, but it cannot be executed that way, and I don't know what I should do to copy the old data from table_1 into table_2.
2) What should I use instead: :NEW or :OLD?
3) What should I do if records already exist between the two dates? I want to skip the existing records.
The following code is what I have, but it has no effect:
CREATE OR REPLACE TRIGGER ATTENDANCEE_FOLLOWS
AFTER INSERT ON ACCESSLOG
REFERENCING NEW AS NEW OLD AS OLD
FOR EACH ROW
DECLARE
V_COUNT NUMBER(2);
V_TIME_OUT DATE;
V_DATE_IN DATE;
V_DATE_OUT DATE;
V_TIME_IN DATE;
V_ATT_FLAG VARCHAR2(3);
V_EMP_ID NUMBER(11);
CURSOR EMP_FOLLOWS IS
SELECT EMPLOYEEID , LOGDATE , LOGTIME , INOUT
FROM ACCESSLOG
WHERE LOGDATE
BETWEEN TO_DATE('18/12/2008','dd/mm/rrrr')
AND TO_DATE('19/12/2008','dd/mm/rrrr');
BEGIN
FOR EMP IN EMP_FOLLOWS LOOP
SELECT COUNT(*)
INTO V_COUNT
FROM EMP_ATTENDANCEE
WHERE EMP_ID = EMP.EMPLOYEEID
AND DATE_IN = EMP.LOGDATE
AND ATT_FLAG = 'I';
IF V_COUNT = 0 THEN
INSERT INTO EMP_ATTENDANCEE (EMP_ID, DATE_IN ,DATE_OUT
,TIME_IN ,TIME_OUT,ATT_FLAG)
VALUES (TO_NUMBER(TO_CHAR(:NEW.employeeid,99999)),
TO_DATE(:NEW.LOGDATE,'dd/mm/rrrr'), -- DATE_IN
NULL,
TO_DATE(:NEW.LOGTIME,'HH24:MI:SS'), -- TIME_IN
NULL ,'I');
ELSIF V_COUNT > 0 THEN
UPDATE EMP_ATTENDANCEE
SET DATE_OUT = TO_DATE(:NEW.LOGDATE,'dd/mm/rrrr'), -- DATE_OUT,
TIME_OUT = TO_DATE(:NEW.LOGTIME,'HH24:MI:SS'), -- TIME_OUT
ATT_FLAG = 'O'
WHERE EMP_ID = TO_NUMBER(TO_CHAR(:NEW.employeeid,99999))
AND DATE_IN <= (SELECT MAX (DATE_IN )
FROM EMP_ATTENDANCEE
WHERE EMP_ID = TO_NUMBER(TO_CHAR(:NEW.employeeid,99999))
AND DATE_OUT IS NULL
AND TIME_OUT IS NULL )
AND DATE_OUT IS NULL
AND TIME_OUT IS NULL ;
END IF;
END LOOP;
EXCEPTION
WHEN OTHERS THEN RAISE;
END ATTENDANCEE_FOLLOWS ;
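One point worth noting about the trigger above: a row-level trigger's :NEW values exist only for the row currently being inserted, so it cannot reach rows that were inserted before the trigger existed. Backfilling old data is usually done once with a plain INSERT ... SELECT rather than inside a trigger. Below is a sketch against the tables named above; the column mapping follows the trigger's insert branch, and the meaning of the INOUT flag is an assumption:

```sql
INSERT INTO emp_attendancee (emp_id, date_in, time_in, att_flag)
SELECT a.employeeid, a.logdate, a.logtime, 'I'
FROM   accesslog a
WHERE  a.logdate BETWEEN DATE '2008-12-18' AND DATE '2008-12-19'
AND    a.inout = 'I'                 -- assumed meaning of the INOUT column
AND    NOT EXISTS (SELECT 1          -- skip employees/dates already loaded
                   FROM   emp_attendancee e
                   WHERE  e.emp_id   = a.employeeid
                   AND    e.date_in  = a.logdate
                   AND    e.att_flag = 'I');
```

After a one-off backfill like this, the trigger only has to handle rows inserted from then on.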
Regards,
Abdetu..
INSERT INTO SALES_MASTER
( NO
, Name
, PINCODE )
SELECT SALESMANNO
, SALESMANNAME
, PINCODE
FROM SALESMAN_MASTER;
Regards,
Christian Balz