Updated from 0.32 to 1.1.0 to 1.1.2 to 1.2.0 - User Login History page blank
-
Current entry is:
define('DATABASE_HOST', 'localhost');
Changing to:
define('DATABASE_HOST', 'p:127.0.0.1');
Restarting FOG server…
No dice. Blank page…
-
[quote=“UpACreek, post: 38550, member: 24526”]I upped mine to 1536M just to see. I still could not get my login history to load. :([/quote]
I’ve tried the same. I run FOG in an ESXi VM. It used to run just fine on a total of 512 MB of RAM and 1 CPU, but trying to resolve this I’ve increased it to 2 GB of RAM and dual CPUs. I have 1.72 million records in my user tracking table, going back to 12-31-2013. I used to have records back to 2011, I think, but I purged those out; at least 4 million or so that I deleted.
Looking at the table, I wonder if there is a way to ignore a user in the logs. We have a few automatic-logon labs that restart frequently. I would like to keep tracking enabled on those machines to log any bypasses of the autologon, but for now I’m disabling tracking on those computers.
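For anyone in the same boat, a purge along these lines could be done straight in MySQL. This is only a sketch: the `userTracking` table name appears elsewhere in this thread, but the `username` column and the account names are assumptions, so verify them against your schema (and back up) before running anything.

```sql
-- Assumed schema: verify column names first with DESCRIBE userTracking;
-- and back up first, e.g.: mysqldump fog userTracking > userTracking_backup.sql
DELETE FROM userTracking
WHERE username IN ('autologon01', 'autologon02');
```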
-
I’m taking a quick look at the way PHP and MySQL are handled. It shouldn’t be a problem no matter the number of results, but PHP defaults to a buffered result set, which should only be used in a case where we must know the number of results or we know the set is going to be limited; unbuffered queries should allow much larger return sets. I’m testing.
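As a rough illustration of the buffered-vs-unbuffered distinction in PHP's mysqli (a sketch with placeholder credentials and an assumed table name, not FOG's actual code):

```php
<?php
// Placeholder connection details; not FOG's real configuration.
$mysqli = new mysqli('127.0.0.1', 'fog', 'password', 'fog');

// Default (buffered, MYSQLI_STORE_RESULT): the entire result set is
// copied into PHP's memory, so a multi-million-row table can blow past
// memory_limit before a single row is processed.

// Unbuffered (MYSQLI_USE_RESULT): rows stream from the server one at a
// time, so memory use stays flat no matter how many rows match. The
// trade-offs: num_rows is unknown until every row has been fetched, and
// the connection is busy until the result is freed.
$result = $mysqli->query('SELECT * FROM userTracking', MYSQLI_USE_RESULT);
while ($row = $result->fetch_assoc()) {
    // process one row at a time
}
$result->free();
```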
-
I’m game to try other ideas. Just let me know. We’re fumbling around quite a bit without this report. You don’t realize how important a feature is, until it suddenly isn’t available.
-
Take a look at the current svn.
You can update your memory_limit right in the FOG GUI.
I’m also taking an approach of using MYSQLI_USE_RESULT for the results page, which should, hopefully, keep the stored variable to a more manageable size, though it’s not likely to fix all issues.
That said, you no longer need to set the memory_limit in the php.ini file, nor will that setting affect FOG anymore; it’s set within FOG and that value is used directly. It defaults to 128 and will never go lower than that. The nice part is you don’t need to restart Apache for the changes to take effect either.
-
I’ll give it a go. crosses fingers I’ll let you know.
-
Bummer… still a blank page after going to Login History.
Even tried changing the memory in the GUI to 1024, but still no dice.
-
How large is your userTracking Table in mysql?
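If it helps anyone answer that, something like the following should report the row count and on-disk size (this assumes the default `fog` database name; the size figures come from information_schema):

```sql
SELECT COUNT(*) AS row_count FROM userTracking;

SELECT table_name,
       ROUND((data_length + index_length) / 1024 / 1024, 1) AS size_mb
FROM information_schema.TABLES
WHERE table_schema = 'fog'
  AND table_name = 'userTracking';
```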
-
[quote=“Wolfbane8653, post: 38679, member: 3362”]How large is your userTracking Table in mysql?[/quote]
Login records span from 2012 to now, and there are ~614,000 records just in the login history table.
-
SVN 2705 released, should fix the memory problem for even the simplest of sortable data.
Alright,
SVN should contain the fix for this pressing problem. While I can’t fix huge lists and “Hosts and Users” will still be broken as it’s pulling all the data at once, it should work for all the other “timed” reports as my method was all screwed up. I had to add a new element to the find method to get it to work, but it seems to work properly now.
Basically (and you all can slap me if you want), as this wasn’t added before, my mechanism for sorting out the proper datasets was to pull the entire set of entries. This was/is where the problem exists. I’ve added a new feature to allow me to adjust the find parameter’s Compare element. Because of this, I can now test anything I want so long as I know the syntax. Its intended purpose, for now, is simply to use BETWEEN '$date1' AND '$date2' calls, but it has potential for much more down the road.
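In rough SQL terms, the change amounts to filtering server-side instead of pulling everything (illustrative only; FOG's actual column names may differ from the `createdTime` guessed here):

```sql
-- Before: fetch every row, then filter in PHP (memory balloons)
SELECT * FROM userTracking;

-- After: let MySQL do the range comparison
SELECT * FROM userTracking
WHERE createdTime BETWEEN '2014-01-01 00:00:00' AND '2014-11-27 23:59:59';
```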
Please test and let me know.
-
Thank you for your continued work on this! Will test after Thanksgiving. gobble gobble
Happy Thanksgiving everyone!
-
I’d love to report positive results… But…
Both ‘Login History’ and ‘Hosts and Users’ result in a blank page after updating to SVN 2721. FOG_MEMORY_LIMIT is set to 1024.
-
Are you limiting the dates, or trying to grab all of them?
-
I can’t even do a dates query. I click on either link, and it blanks out. I don’t get to a page where it will even begin to allow me to specify a query.
-
This release broke other functions as well.
Trying to list all images results in a blank page.
Trying to search for an image results in this: [url=“/_imported_xf_attachments/1/1526_Uhoh.png?:”]Uhoh.png[/url]
-
Hello,
Is there any advance on this problem?
I’m running 1.2.0 on a SuSE platform. I haven’t discovered any problems with reports, but I suffer the same problem as described above if I try to list the images or do a search.
Clicking on ‘List All Images’ results in a blank page (the page source is blank too, which suggests that nothing is rendered).
‘New Search’ shows a page, and if I enter a random string I get ‘0 results found’, which is to be expected. If I start to enter the name of a valid image, I get ‘Internal Server Error’ as UpACreek does.
Thanks in advance for any help.
Alasdair
-
[quote=“Alasdair Hatfield, post: 40435, member: 24780”]Hello,
Is there any advance on this problem?
I’m running 1.2.0 on a SuSE platform. I haven’t discovered any problems with reports, but I suffer the same problem as described above if I try to list the images or do a search.
Clicking on ‘List All Images’ results in a blank page (the page source is blank too, which suggests that nothing is rendered).
‘New Search’ shows a page, and if I enter a random string I get ‘0 results found’, which is to be expected. If I start to enter the name of a valid image, I get ‘Internal Server Error’ as UpACreek does.
Thanks in advance for any help.
Alasdair[/quote]
Did you try any of the suggestions described throughout this thread? I don’t know what’s causing the issue with images, but it sounds like either you have a ton of images or there are an image or two that aren’t holding the proper ID for whatever reason. On top of that, when you get the blank page, what is in your error log?
-
Hello Tom,
Thanks for the fast reply.
The memory suggestions in the thread seemed unlikely to me as I am currently only dealing with two images. Correct me if this assumption is wrong.
This is what I did -
allow PC (host) to create itself as a host on the server
create an image for the host
link the image to the host
initiate a task to upload the image
NB - this does not complete - stars appear on the host but the server doesn’t end the task. So, I move the image from dev/ to images/ with the assigned name of the image. I don’t change any rights.
If I then check the image list - problem.
However, if I initiate a download, all seems to run smoothly.
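For reference, the manual workaround described above looks roughly like this on the server. Paths and names are examples only: uploads typically land under `/images/dev` in a default FOG layout, but the MAC-based folder name, image name, and the `fog` user all depend on your install.

```shell
# Uploads sit in /images/dev/<mac-address> until FOG's FTP step moves them.
# Example names only; substitute your real MAC folder and image name.
mv /images/dev/0123456789ab /images/MyImage
chown -R fog /images/MyImage   # the FTP user FOG logs in as must own the files
chmod -R 775 /images/MyImage
```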
I have looked at the log files under /opt/fog but they don’t appear to have been touched since my last attempts a month ago.
There are 4 logs - fogscheduler, fogreplicator, groupmanager, multicast
Am I looking in the wrong place, or have I managed to kill logging somewhere along the line?
Best
Alasdair
-
Here is the images table from the db if it gives any clues
[url=“/_imported_xf_attachments/1/1590_images_table.txt?:”]images_table.txt[/url]
-
So I’m not sure where the problem lies or where the error logs are found on SuSE Linux. Maybe in /var/log/httpd or /var/log/apache.
These issues, based on what you’re telling me, appear to be related to FTP though.
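If it helps, a quick way to hunt down the Apache error log on a SuSE box (locations vary by distro and version, so the paths below are guesses to try, not a definitive answer):

```shell
# Common Apache log directories; see which ones exist on your system.
ls -d /var/log/apache2 /var/log/httpd /var/log/apache 2>/dev/null

# On SuSE the error log is commonly /var/log/apache2/error_log
tail -n 50 /var/log/apache2/error_log
```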